Byte, febrero del 86

Seguimos con el proyecto mensual de ojear la revista Byte… con cuarenta años de retraso (tenéis todas las entradas sobre el tema, que ya son unas cuantas, en la etiqueta Byte de este blog). Y febrero del 86 se dedicaba… al procesado de textos (que, spoiler, no es lo mismo que los procesadores de texto).

Portada de la revista Byte de febrero de 1986. El tema es el procesado de textos. La ilustración de portada es una placa de ordenador sobre la que flota la palabra TEXT

Y comenzamos mirando publicidad. El primer anuncio, diría yo, de un programita que seguimos usando cuarenta años más tarde: ¡Excel! Dice la Wikipedia que fue lanzado en septiembre del 85, y si vais a nuestra entrada del número de mayo del 85 (sí, llevamos ya un tiempín con esta historia de la revista Byte) encontraréis el anuncio de que lo iban a lanzar, y corregidme si me equivoco (ya podría ser, ya), pero no lo habíamos vuelto a ver por aquí.

Anuncio a doble página de Microsoft Excel. Vemos un ratón con un único botón y un diskette de tres pulgadas y media

Y si os ha llamado la atención el ratón monobotón, o el disquete de 3,5″… sí, Microsoft lanzó originalmente Excel solo para Mac.

No pongo captura, pero también merece la pena pararse en la sección de cartas (página 24 y siguientes), en que los lectores revisan el programa para calcular π (¡del número de mayo!) y explican lo lentísimo que es convergiendo (pero destacan que es muy legible y un buen ejemplo para aprender) y algunas correcciones al programa sobre la distribución normal (esta vez solo tenemos que retroceder hasta octubre). Bravo por los lectores atentos.

Seguimos, esta vez con nuestra manía de pararnos en cualquier cosa que tenga que ver con el Amiga. En este caso, se trata de una introducción al Kernel, el software de sistema contenido en su ROM, escrita nada más y nada menos que por su creador, el mítico (en círculos reducidos, cierto) RJ Mical. Si alguien quiere leer más sobre el tema, en el mismo Archive podéis encontrar su manual. #YaNoSeEscribeSoftwareAsí

Introduction to the Amiga ROM Kernel

A look inside the Amiga by the creator of Intuition

Editor's note: The first version of this article appeared on BIX (BYTE Information Exchange) on October 10, 1985.

This article introduces the building blocks of the Amiga ROM (read-only memory) Kernel software. I will examine the ROM Kernel including AmigaDOS and the disk-based libraries and devices, and present examples of translating code from other machines to the Amiga. Finally, I'll look at the hardware and special features of the ROM Kernel, describing how to use these directly in a system-integrated fashion. (Editor's note: For an overview of the Amiga from Commodore, see "The Amiga Personal Computer" by Gregg Williams, Jon Edwards, and Phillip Robinson, August 1985 BYTE, page 83.)

System Overview

It is rare for software and hardware groups to work as closely together as we did at Amiga. We exchanged and debated ideas continuously during the creation of the Amiga. The close relationship influenced the design, bringing new features to the hardware and allowing the software to take full advantage of the hardware.

The Amiga's greatest strengths lie in its modularity and the interconnections among its system components, both hardware and software. The design teams designed and developed simultaneously, and from the start the pieces were intended to complement one another. Even though we designed the hardware pieces to fit tightly together, you can use any subset of the features without the necessity of controlling the entire machine. It's the same with the ROM software, where the pieces work closely together but each can stand alone.

The hardware and software combine efforts in many ways to achieve the Amiga's performance. For instance, the hardware includes a special coprocessor, the Copper, which synchronizes itself to the display position of the video beam without tying up the bus or the processor. The Copper can move data to one of the many hardware registers or it can cause a 68000 interrupt, which the Amiga's multitasking Exec (also known as Executive) then processes. This makes the Copper a powerful, unobtrusive auxiliary tool. It is used by the Graphics Support library for display-oriented changes and by the audio device for time-critical audio channel manipulations. You can use the Copper for time-critical operations because it's tied to the display, which is guaranteed to run at 60 Hz (the display processors start from the top of the screen 60 times a second).

The way the Amiga handles communications with its peripherals is another example of the union of hardware and software. The signals that pass between the Amiga and its peripherals are interrupt-driven. Peripherals, therefore, do not disturb the system or require monitoring until information needs to be communicated. The Amiga Exec works with the interrupt-driven communication by managing a complete interrupt-processing mechanism, providing a convenient, interleaved, prioritized processing of interrupts.

The multitasking Exec forms the core of the system software; it is a compact collection of routines that underlies the rest of the Amiga ROM software. The developers attempted to optimize the Exec for space, performance, clarity of usage, and the creation and management of lists, which are the primary components of Exec. All of the other pieces of the Exec are built on lists and, therefore, provide performance with a minimum of system overhead. You will be able to use even the more esoteric Exec functions once you learn the concept of the Exec list.

Exec is the starting point for all the other pieces of ROM software, mostly because it is the controller of tasks and interrupts. Each of the ROM Kernel software components is designed to stand alone as much as possible; programmers can choose which components to use. But at the...

Y unas páginas más adelante nos encontramos un anuncio del Amiga que es un homenaje (merecidísimo) a Denise, Paula y Agnus, los tres chips especializados en vídeo, audio y gestión de memoria, revolucionarios para la época, que eran una de las partes vitales para hacer del Amiga la maravilla multimedia que era.

Anuncio del Amiga de Commodore. Se muestran tres chips, y se presume de 4096 colores, sonido estéreo de cuatro canales, 32 instrumentos, 8 sprites, animación en 3D, 25 canales DMA, un bit blitter y voces masculina y femenina

Y dejamos el Amiga (hasta que nos den la más mínima oportunidad de recuperar el tema 😅) y entramos en el tema del número, el procesado de textos. Hablando con la leyenda de la informática que es Donald Knuth (se lee Kanuz, por cierto), hoy profesor emérito de Stanford, creador de TeX y autor del magnum opus The Art of Computer Programming (in progress). Por aquella época ya hacía más de una década que le habían dado el premio Turing y en la entrevista, como no podía ser de otra forma dado el tema, hablan de tipografía digital y de la creación de Metafont, un software que se sigue usando hoy y que continúa siendo una [no tan] pequeña maravilla.

COMPUTER SCIENCE CONSIDERATIONS

CONDUCTED BY G. MICHAEL VOSE AND GREGG WILLIAMS

Donald Knuth speaks on his involvement with digital typography

Text processing as a computer science problem has consumed a major portion of the time and energy of Stanford professor Donald Knuth over the past eight years. Knuth authored and placed into the public domain a highly regarded typography system that he calls TeX (pronounced "tech"), along with a font creation language called METAFONT. In conjunction with the completion of TeX, Knuth and Addison-Wesley are publishing a five-volume work entitled Computers and Typesetting. Volume 1 is The TeXbook, volume 2 is the source code for TeX, volume 3 is The METAFONT Book, volume 4 is the METAFONT source code, and volume 5 is Computer Modern Typefaces.

To discover what so intrigued Knuth about this subject, BYTE senior editors Gregg Williams and Mike Vose conducted the following interview with Professor Knuth at Addison-Wesley's offices in Reading, Massachusetts, on November 11, 1985.

BYTE: Dr. Knuth, how did you become involved with digital typography and the public-domain system known as TeX?

Knuth: I got interested because I had written books and seen galley proofs, and suddenly computers were getting into the field of typesetting and the quality was going down.

Then I was working on a committee at Stanford planning an exam, and we got a hold of some drafts of Patrick Winston's book on artificial intelligence. We were looking at it to see if we should put it on the reading list for a comprehensive exam. It had just been brought in from Los Angeles where it had been done on a digital phototypesetter. This was the first time that I had ever seen digital type at high resolution. We had a cheap digital machine at Stanford that we thought of as a new toy. But never would I have associated it with printing a book that I'd be proud to own. Then I saw this type, and it looked as good as any I had ever seen done with metal. I knew that it was done just with zeroes and ones. I knew that it was bits. I could never, in my mind, ever, conceive of doing anything with lenses or with lead, metallurgy, and things like that. But zeroes and ones was different. I felt that I understood zeroes and ones as well as anybody! All it involved was getting the right zeroes and ones in place and I would have a machine that would do the books and solve all the quality problems. And, also, I could do it once and for all. I still had a few more volumes to write [of his seminal work, The Art of Computer Programming, a seven-volume series of which three volumes are finished] and

Y, para hacer más énfasis en lo que decía de que procesado de texto no se refiere a los procesadores de texto (al menos, no a los que nos vienen más rápidamente a la cabeza), nos podemos dar un chapuzón en cómo estaba por aquel entonces el estado del arte de la interpretación del lenguaje natural:

INTERPRETATION OF NATURAL LANGUAGE

by Jordan Pollack and David L Waltz

A potential application of parallelism

This article was adapted from "Parallel Interpretation of Natural Language," presented to the International Conference on Fifth Generation Computer Systems, November 1984.

THE INTERPRETATION of natural language requires the cooperative application of both language-specific knowledge about word use, word order, and phrase structure and real-world knowledge about typical situations, events, roles, contexts, and so on. While these areas of knowledge seem distinct, it isn't easy to write a program for natural-language processing that decomposes language into its parts; i.e., you cannot construct a psychologically realistic natural-language processor by merely conjoining various knowledge-specific processing modules serially or hierarchically.

We offer instead a model based on the integration of independent syntactic, semantic, and contextual knowledge sources via spreading activation and lateral inhibition links. Figure 1 shows part of the network that is activated with the sentence

John shot some bucks. (1)

Links with arrows are activating, while those with circles are inhibiting. Mutual inhibition links between two nodes allow only one of the nodes to remain active for any duration. (However, both nodes may be simultaneously inactive.) Mutual inhibition links are generally placed between nodes that represent mutually incompatible interpretations, while mutual activation links join compatible ones. If the context in which this sentence occurs has included a reference to "gambling," only the shaded nodes of figure 1a remain active after relaxation of the network. But if "hunting" has been primed, only the shaded nodes shown in figure 1b will remain active. Notice that the "decision" made by the system integrates syntactic, semantic, and contextual knowledge: The fact that "some bucks" is a legal noun phrase is a factor in killing the readings of "bucks" as a verb; the fact that "hunting" is associated with both the "fire" meaning of "shot" and the "deer" meaning of "bucks" leads to the activation of the coalition of nodes shown in figure 1b; and so on. At the same time, the knowledge base in our model is easy to add to or modify. In this model of processing, decisions are spread out over time, allowing various knowledge sources to be brought to bear on the elements of the interpretation process. This is a radical departure from cognitive models based on the convenient decision procedures provided by conventional programming languages.

Our program operates by dynamically constructing a graph with weighted nodes and links from a sentence while running an iterative operation that recomputes each node's activation level (or weight) based on a function of its current value and the inner product of its links...
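(Esa dinámica, cada nodo recalcula su activación a partir de su valor actual y de la suma ponderada de sus enlaces, se puede esbozar en unas pocas líneas. Un juguete en Python para la frase del ejemplo; los nombres de los nodos y los pesos son invención nuestra, no están sacados del artículo:)

```python
# Red de juguete de activación propagada con inhibición lateral, al estilo
# del modelo de Pollack y Waltz. Nodos y pesos son suposiciones ilustrativas.

NODES = ["hunting", "gambling", "shot_fire", "shot_waste",
         "bucks_deer", "bucks_dollars"]

# Peso positivo = enlace de activación mutua; negativo = inhibición mutua.
LINKS = {
    ("hunting", "shot_fire"): 0.5,
    ("hunting", "bucks_deer"): 0.5,
    ("gambling", "shot_waste"): 0.5,
    ("gambling", "bucks_dollars"): 0.5,
    ("bucks_deer", "bucks_dollars"): -0.8,
    ("shot_fire", "shot_waste"): -0.8,
}

def neighbors(node):
    # Los enlaces son simétricos: se recorren en ambos sentidos.
    for (a, b), w in LINKS.items():
        if a == node:
            yield b, w
        elif b == node:
            yield a, w

def relax(primed, steps=200, decay=0.9, rate=0.1):
    """Relaja la red con un nodo de contexto fijado a 1 y devuelve
    el conjunto de nodos que quedan activos."""
    act = {n: 0.0 for n in NODES}
    act[primed] = 1.0
    for _ in range(steps):
        new = {}
        for n in NODES:
            net = sum(w * act[m] for m, w in neighbors(n))
            a = decay * act[n] + rate * net
            new[n] = min(1.0, max(0.0, a))  # activación acotada a [0, 1]
        new[primed] = 1.0                   # el contexto se mantiene fijado
        act = new
    return {n for n, a in act.items() if a > 0.25}

print(sorted(relax("hunting")))   # ['bucks_deer', 'hunting', 'shot_fire']
print(sorted(relax("gambling")))  # ['bucks_dollars', 'gambling', 'shot_waste']
```

Con «hunting» activado sobreviven las lecturas «disparar» y «ciervos»; con «gambling», las de «malgastar» y «dólares», que es exactamente la desambiguación que describe el artículo para las figuras 1a y 1b.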

(Como es costumbre de la casa, tanto Pollack como Waltz son no solo expertos, sino pioneros en la materia.)

Seguimos con el tema. Nos quejamos (con razón) de que artes y humanidades están excesivamente separadas en las cabezas de muchos, y de que esto es fuente de unos cuantos de nuestros problemas. En los ochenta ya era en gran parte así, no nos engañemos, pero de vez en cuando podíamos ver cosas como un artículo en una revista tecnológica dedicada al tema del procesado de… poesía.

POETRY PROCESSING

by Michael Newman

The concept of artistic freedom takes on new meaning when text processing handles the mundane tasks of prosody

For over a year, Michael Newman, Hillel Chiel (a researcher at Columbia Medical School), and Paul Holzer (a programmer and analyst for PaineWebber) have been developing The Poetry Processor: Orpheus A-B-G. The software is not yet commercially available, but we are pleased to share Michael Newman's thoughts on poetry processing and a module of Paul Holzer's code that shows off some of the new application's capabilities.

THE PROPERTIES OF a medium can have a decisive impact on the nature of what the medium conveys. Poetry began in an oral bardic tradition. It was newsy, folksy, evocative of the doings of great heroes. It had to be accessible to folk encountered at a roadside as well as pleasurable to more educated people met at court. There was no great emphasis on intricate forms, on how the poem looked on a page, because the page was not where the poem resided. The poem was voice-resident, ear-active. When Gutenberg invented movable type he did more than spring the Bible. His invention ultimately provided a watershed, an opportunity for the consolidation of language itself — and Shakespeare jumped on the opportunity. He reconfigured poetry, bringing together history, tragedy, and comedy under its roof. And, by casting poetry as theatre, he popularized it immensely.

Poetry in print became more permanent, less permutable; more visual, less aural. In this century, with the development of free verse, the poem has become almost a visual object, broken up and spread all over the page. There is even concrete poetry, which makes a fetish of typography.

Another world that makes a fetish of typography is software, specifically the largest part of software: word processing. Software is about as permanent as print because you can always get a printout, but it is much more permutable. And, above all, it is interactive.

So what will be the impact of this revolutionary new medium on the oldest, most interactive, programmatic, musical, and image-provoking form of human speech? And what will be the impact of poetry on software?

Classical poetic forms, such as the sonnet, the villanelle, and the sestina, are natural-language programs, algorithms. The sonnet is a set of instructions specifying 14 lines of iambic pentameter; a line of iambic pentameter contains five iambic units (feet). An iamb is a two-syllable unit with the accent on the second syllable.

Poetic algorithms have more in common with programming than their algorithmicness and use of powerful syntax. Poems involve iteration: Not only do iambs repeat and five-beat lines repeat, but ending-sounds repeat (rhyme in a sonnet), whole lines repeat (refrains and rhymes in a villanelle), words repeat (ending words in a sestina). Individual letters repeat in alliteration. This repetition is something poets count, and something poetry readers see and hear. If poets can count these things, so can a computer. If readers see and hear these things, so can the computer user— in an enhanced way.

Poems also involve two other cornerstones of computer science: recursion and conditionality. Every sonnet written refers to others of its kind. It...

No os perdáis, por favor, la discusión sobre cómo sacar la métrica de un poema automáticamente (en inglés, además, donde la cosa depende más de sílabas átonas y tónicas que en español):

Machine Reading of Metric Verse
by Paul Holzer

A computer can definitively scan a line of poetry for its stress pattern principally in one of two ways: (1) an algorithm can deduce the syllabic structure and the stressed syllables from analysis of the letters that make up the word, or (2) the computer can look up every word in a dictionary database that holds the syllabification and accentuation of every word. The lookup method requires a large database, and the algorithmic approach is complex and requires a deep analysis of English phonetics and spelling.

One of the features of a poetry processor is that the poet-user can specify the meter of every line of a poem (see photo A). For example, the string .-/.-/.-/.-/.-/ represents iambic pentameter. Dots (.) indicate an unstressed syllable and dashes (-) represent a stressed one. The slash (/) indicates the end of a foot, the basic metric unit. The first line of Shakespeare's Sonnet 18

shall I comPARE thee TO a SUMmer's DAY?

is an example of a line of iambic pentameter. The stressed syllables are in uppercase.

After writing a poem, users might request a metric scan of the poem. I will describe here a method for doing this that is not based on one of the two general solutions I mentioned in the first paragraph. Instead, the processor will break each word into its syllables and then redisplay each line, with each syllable in uppercase or lowercase according to the position of the dots and dashes in a user-specified metric form. So, were Shakespeare trying to compose trochaic pentameter, with the metric pattern -./-./-./-./-./, the processor would reply with

SHALL i COMpare THEE to A sumMER'S day?

He would read this to himself, trying to put the stress on the uppercase syllables. Noting the rhythmic clumsiness, he might rewrite his line as follows:

To a summer's day I shall compare thee

and the processor would respond:

TO a SUMmer's DAY i SHALL comPARE thee.

Sounds better!
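(El paso de «releer» la línea según el patrón métrico se deja esbozar en unas pocas líneas de Python. Boceto nuestro, no el Pascal de Holzer: el nombre de la función y la división en sílabas hecha a mano son suposiciones nuestras:)

```python
def apply_meter(words, pattern):
    """Pone en mayúsculas las sílabas que caen en posición acentuada (-).

    words: lista de palabras, cada una como lista de sílabas ya divididas.
    pattern: cadena de '.' (átona) y '-' (tónica); las '/' se ignoran.
    """
    marks = [c for c in pattern if c in ".-"]
    out, i = [], 0
    for word in words:
        pieces = []
        for syl in word:
            stressed = i < len(marks) and marks[i] == "-"
            pieces.append(syl.upper() if stressed else syl.lower())
            i += 1
        out.append("".join(pieces))
    return " ".join(out)

# Primer verso del Soneto 18, con las sílabas divididas a mano:
line = [["shall"], ["i"], ["com", "pare"], ["thee"], ["to"],
        ["a"], ["sum", "mer's"], ["day"]]

print(apply_meter(line, ".-/.-/.-/.-/.-/"))  # shall I comPARE thee TO a SUMmer's DAY
print(apply_meter(line, "-./-./-./-./-./"))  # SHALL i COMpare THEE to A sumMER'S day
```

El mismo verso, pasado por el patrón yámbico y por el trocaico, reproduce las dos «lecturas» que muestra el artículo.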

The main task for the computer is to break each word into its syllables. The algorithm is based on a systematic application of what appear to be the general rules by which English words break into syllables. Of course, there are no fixed rules, as evidenced by the fact that different dictionaries give different syllabifications for the same word.

The following is a simple version of the algorithm:

1. Break the word up into a sequence of alternating vowel and consonant groupings. Thus microcomputer becomes m i cr o c o mp u t e r. Wherever there is a vowel or group of contiguous vowels, there will be a syllable. We need only assign the neighboring consonants to the syllable on the right or to the syllable on the left.

2. If the first vowel group has a consonant group to its left, then assimilate this consonant group to the vowel group. This leads, in our example, to mi cr o c o mp u t e r.

3. If the final vowel group has a consonant group to its right, then assimilate this consonant group to the vowel group. We now get mi cr o c o mp u t er.

4. For the remaining unassigned consonants, do the following:

a. If the consonant stands alone, attach it to the following vowel. Thus we get mi cr o co mp u ter.

b. If there are two consonants, split them. We get mic ro com pu ter.

c. If there are three consonants, then:

i. If there is a doubled consonant, split the pair; thus apply becomes a ppl y and finally ap ply.

ii. If there is no doubled consonant, but the first of the three consonants is n, r, or [, then split between the second and third consonants.

iii. In all other cases, split between the first and second consonants.
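(Las reglas 1 a 4 se dejan traducir casi literalmente a código. Un boceto mínimo en Python, cosecha nuestra y no de Holzer: trata la y como vocal, como sugiere el ejemplo de apply, y omite el preprocesado que el artículo describe a continuación:)

```python
import re

VOCALES = "aeiouy"  # tratamos la y como vocal, como sugiere el ejemplo de "apply"

def partir_consonantes(c):
    """Reparte un grupo de consonantes entre la sílaba izquierda y la derecha."""
    if len(c) == 1:
        return "", c           # regla 4a: se une a la vocal siguiente
    if len(c) == 2:
        return c[0], c[1]      # regla 4b: se separan
    if c[0] == c[1]:
        return c[0], c[1:]     # regla 4c.i: consonante doblada, se parte el par
    if c[1] == c[2]:
        return c[:2], c[2:]    # regla 4c.i, con la doblada al final
    if c[0] in "nrl":
        return c[:2], c[2:]    # regla 4c.ii: corte entre la segunda y la tercera
    return c[0], c[1:]         # regla 4c.iii: corte entre la primera y la segunda

def silabear(palabra):
    # paso 1: grupos alternos de vocales y consonantes
    grupos = re.findall(f"[{VOCALES}]+|[^{VOCALES}]+", palabra.lower())
    silabas, pendiente = [], ""
    for i, g in enumerate(grupos):
        if g[0] in VOCALES:
            silabas.append(pendiente + g)  # cada grupo vocálico abre sílaba
            pendiente = ""
        elif not silabas:
            pendiente = g                  # paso 2: consonantes iniciales
        elif i == len(grupos) - 1:
            silabas[-1] += g               # paso 3: consonantes finales
        else:
            izq, der = partir_consonantes(g)  # paso 4
            silabas[-1] += izq
            pendiente = der
    return silabas

print(silabear("microcomputer"))  # ['mic', 'ro', 'com', 'pu', 'ter']
print(silabear("apply"))          # ['ap', 'ply']
```

Reproduce los dos ejemplos del artículo: mic ro com pu ter y ap ply (y también el e nact del punto 4 del preprocesado).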

Before applying this algorithm, however, we must preprocess the initial string of letters in order to take into account certain peculiarities of English orthography:

1. Final e is silent (with certain exceptions); treat it as a special consonant. Thus compute becomes com pute rather than com pu te.

2. Translate many two-letter sequences into special single consonants, e.g., sh, th, gu, qu, and ck.

3. Identify common suffixes. For example, the algorithm applied to blameless would yield bla me less. However, when less is removed as a suffix, the e in blame would be recognized as silent, yielding blame less.

4. Identify some prefixes. For example, if en is recognized as a prefix, then enact becomes en act, rather than e nact.

It seems to be impossible to come up with a reasonably small set of rules and preprocessing steps to guarantee correct syllabification of all words. Two examples will illustrate some of the inherent difficulties:

1. Compound words: The algorithm will not detect the silent e in snake within the compound word snakebite unless the fragment bite is recognized as a word or treated as a suffix. Avoiding the problem would require either extensive word or prefix table lookups.

2. Successive vowels in different syllables: In reach, the ea is a single vowel sound, and the algorithm would treat it correctly. In react, we pronounce the e and a separately and the correct syllabification is re act. Were the algorithm modified to isolate re as a prefix, it would treat react correctly, but turn reach into re ach.

Where ambiguities can arise, the best approach is to formulate a rule that leads to the smallest number of cases requiring table lookups for resolution. The present algorithm is not perfect, but it produces a readable, if not dictionary-perfect, syllabified word 95 percent of the time.

I have provided a Pascal program that implements the syllabification algorithm and illustrates how The Poetry Processor "reads" a user's poem according to a user-specified metric scheme. [Editor's note: The Microsoft Pascal source code and executable version are available from BYTEnet Listings, telephone (617) 861-9764, as SCANPOEM.PAS and SCANPOEM.EXE. The executable version requires any MS-DOS or PC-DOS machine.] To run the program, prepare two files. TEST.POE must contain the lines of poetry. You can write TEST.POE as a text file with each line of the poem on a separate line. A second text file, TEST.FRM, should have a line containing a string of dots (.) and dashes (-) indicating the accentual scheme that each line of poetry is supposed to follow. Slashes indicating the end of a foot are optional.

As an example, a Shakespearean sonnet (iambic pentameter) will have a TEST.FRM file consisting of 14 lines of .-/.-/.-/.-/.-/. Each line in TEST.FRM must end with an asterisk. After editing the TEST.FRM and TEST.POE files, you can run the program by entering its name, SCANPOEM. The computer will "read" the poem, printing in uppercase the appropriately stressed syllables.

Note that the program is a prototype version of the algorithm. It will not handle text with capital letters, apostrophes, or punctuation, so be careful not to include these features in TEST.POE. When using this demonstration program, you will undoubtedly find that some words are not properly syllabified.

Pero el colmo del friquismo, en serio, es un artículo entero dedicado a la sesudísima (solo hago un poco de broma, aquí) cuestión de si vale la pena aprender a teclear en un teclado Dvorak (#TLDR, los autores opinan que sí, si te puedes permitir el lujo de escribir siempre en un teclado Dvorak). Que el primer firmante de la pieza sea profesor emérito… de física, dedicado a la astronomía forense, es solo la guinda del pastel.

¿Había dicho yo que volveríamos al tema Amiga a la que nos dieran una oportunidad? Sí, ¿verdad? Aquí, los orígenes británicos de AmigaDOS:

Tripos—The Roots of AmigaDOS

Metacomco is the British company behind AmigaDOS

by Dick Pountain

A question that must be puzzling many people in U.S. computer circles is, "What is Metacomco?" When Commodore announced its spectacular Amiga computer, much of the U.S. press failed to point out (and possibly did not know) that the advanced operating system AmigaDOS was in fact written by a small British software house called Metacomco. (For more information on the Amiga, see "The Amiga Personal Computer" by Gregg Williams, Jon Edwards, and Phillip Robinson, August 1985 BYTE, page 83.)

Metacomco is based in Bristol, England, a city that is beginning to rival Cambridge as our potential computing capital (it also houses TDI-Pinnacle, INMOS, and others). Metacomco was founded in 1981 by Derek Budge and Bill Meakin and now employs about 25 people, mainly programmers and other technical staff.

The company's first product was a portable BASIC interpreter written in BCPL, the forerunner of C, which is taught and used extensively at Cambridge University. This interpreter was ported to the 8086 processor and shortly afterward was sold to Digital Research Inc., which still markets its descendant as Personal BASIC. This U.S. link became very important to Metacomco, for the royalties provided a steady source of income during the crucial early years and helped the company establish an office in California, which kept Metacomco in touch with the U.S. computer scene.

In 1983 Dr. Tim King, a Cambridge computer scientist, was engaged by the company as a consultant, and Metacomco's emphasis switched to the 68000 processor, with which King had been working since the first samples came out in 1981. The company produced a series of development tools, also written in BCPL, including a full-screen editor, a macro assembler, and a linking loader. At that time there was no clearly established standard operating system for the 68000, so the next step was to write one. Subsequently, Tripos was born.

The Tripos operating system was based on a multitasking kernel developed as a doctoral thesis project at Cambridge in 1976. ("Tripos" was the name given to the three-legged stools that students sat on in the old days when taking their examinations and has since become the colloquial name for the Cambridge final examinations.) King, then working at Bath University, took the kernel written for a DEC PDP-11 and made it into a full 32-bit multitasking operating system for the Sage microcomputer (which was new at that time). Tripos is BCPL-based in the same way that UNIX is C-based, and it has many innovative features that I will discuss.

Metacomco had also purchased the rights to Cambridge LISP, a powerful LISP interpreter/compiler originally developed for the IBM 370 and then ported to the 68000 at Cambridge. Metacomco produced versions for the ill-fated CP/M 68K and then for Tripos. Reduce 3, a symbolic math system written in LISP, was added to produce a Sage-based workstation that was sold to research institutions in various countries. Customers included SORD in Japan and Bristol neighbor INMOS, which used BCPL for the first stage of bootstrapping its Occam compiler onto the 68000, using Sage computers running Tripos.

In 1984, Tim King joined Metacomco full-time as Research Director, and Sinclair Research launched the QL. Initially the QL lacked a serious software-development environment, and Metacomco was able to quickly port its development tools, including the BCPL compiler, to it. The company has since extended the range to include an ISO (International Organization for Standardization)-validated Pascal compiler, and it markets these products directly, rather than via the manufacturer, largely by mail order.

November 1984 is the crucial date in the AmigaDOS story. Metacomco visited Amiga...

Y aún una página más con contenido Amiga, aunque aquí no sea el contenido lo que quiero destacar, sino el continente. Estamos en 1986, y el mundo comienza a conectarse digitalmente. Byte, de hecho, tiene su propio servicio online, BIX (el Byte Information Exchange), que se había puesto en marcha en junio (a seis dólares de la época la hora de conexión)… pero la audiencia era tan reducida (dice la Wikipedia que en el 87 llegaron a 17.000 usuarios) que la revista le daba bombo al servicio destacando un «Best of BIX» en sus páginas. Igual sí hemos cambiado un poco, en estos cuarenta años…

Best of BIX

AMIGA

Commodore's introduction of the Amiga has produced a flurry of activity among professional developers and personal computer users within the Amiga conference. The summary this month includes discussion on cables, monitors, printers, and software fixes. One of the hottest topics in the Amiga conference is on the subject of improving the performance of the Amiga by removing the 68000 and replacing it with a 68010 or 68020.

68010/68020 Upgrade

amiga/amiga68000 #22

An Amiga conference member asked if he could just drop a 68010 into the 68000 socket. This would give a 10 to 80 percent boost in performance! He had one, just sitting up to its bottom in black foam, on the shelf. But there were all these warnings about what would happen to his warranty if he opened the case.

amiga/amiga68000 #26, from rickross [Richard Ross, Eidetic Imaging]

M68010 works! A 68010 plugs directly into the Amiga and no problems were detected in the operation of the system software. Also, for everyone like me who has been trying to judge from the BYTE review photos, the microprocessor is socketed. The performance increase gained by the switch is not phenomenal, and no benchmarks are available, but it did run perceptibly faster. The M68020 has also been tried and seems to work as well.

amiga/amiga68000 #32

A BIX user provides the following:

The company that markets the 68020 piggyback board is Computer System Associates Inc., 7564 Trade St., San Diego, CA 92121, (619) 566-3911. The prices are:

Board only: $575
Board plus 68020: $975
Board plus 68020 and 68881: $1480

For more information, contact Patricia Chouinard at the address above. I believe that 68000/68010 supervisor code that handles exceptions and certain other privileged functions will have to be modified. User code should work as is.

amiga/tech.talk #39

An Amiga owner describes his adventure in opening his computer and replacing the CPU:

You just got your Amiga and it's already the slow boy on the block, right? You can plug a 68010 into an Amiga (there goes my warranty) and it does go faster. My Sieve benchmark is down to 5.8 seconds from 6.1.

Note: Your warranty will most likely be dead after you do this. Also, there is a lot of RFI shielding inside the Amiga. You get to undo a lot of screws, bend a couple of tabs, and pray a lot. If you aren't a tech type, don't even think about doing this yourself. The 68000 is socketed, but it is partially under the micro-disk drive, so you have to lift it from one end and kind of levitate out the other end (use of your CHI helps). Also, you only take out the screws in the deep wells on the bottom (five in all). Then there are four places where the top grabs the base at the four corners (there were already marks on mine from where it was put together, I guess). Once you have the top off there is a big surprise waiting for you... Another big surprise is that big RFI shield. Yes, it is a $#%+& to get off! There are screws on three sides and two tabs of metal to untwist. Once the shielding is out of the way, your first sight is of the WCS [writable control store] daughterboard. The custom chips and two parallel I/O chips are made with MOS technology.

The CPU is made by Motorola. The main board looks pretty much like the BYTE review photos. The boot ROMs are 27256s! This gives a 32K-byte by 16-bit boot ROM! What are you guys hiding in there? I could put a BASIC interpreter in that much space!

If you attempt to change your CPU, don't blame me if you muff it! If you don't know about how to make yourself static-free, you could really buy yourself some trouble of the worst kind.

Compatibility: I've run all of the Workbench demos. Everything seems fine, but I'm not making any promises...
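The Sieve figure quoted in this thread is the classic BYTE benchmark (Jim Gilbreath's Sieve of Eratosthenes over an 8,190-element flag array). As a point of reference, here is a minimal Python transcription of that benchmark's loop; Python is my choice, not the original's (the historical versions were in BASIC, C, Pascal, and assembly):

```python
# The classic BYTE "Sieve" benchmark loop, transcribed to Python.
# SIZE = 8190 is the canonical array size; the benchmark famously
# reports 1899 primes per pass.

SIZE = 8190

def sieve() -> int:
    flags = [True] * (SIZE + 1)
    count = 0
    for i in range(SIZE + 1):
        if flags[i]:
            prime = i + i + 3                 # index i stands for the odd number 2i+3
            for k in range(i + prime, SIZE + 1, prime):
                flags[k] = False              # strike out multiples
            count += 1
    return count

print(sieve())  # 1899, the canonical result
```

A compiled pass of this loop took on the order of seconds on a stock 68000-era machine, which is the scale of the 6.1-to-5.8-second numbers quoted above.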

amiga/tech.talk #41

The adventurous Amiga owner says that yes, his Amiga boots up, squeaks and everything! All the software he has runs and works great. The only potential problem at this point is how many times the MOVE SR,dest op code is used. This is the only active op-code difference. There is a whole host of new goodies, though, some that make a desire for an MC68881 easier to satisfy.

amiga/tech.talk #43: a comment to 39

Another BIX subscriber replied that the upgrade produced only a 5 percent increase in throughput. Perhaps fortunate, because the descriptions of the hardware here have indicated that bus bandwidth consumption by the 68000 is low enough to allow other custom DMA chips to steal enough cycles to get their work done. It would appear that inserting a 68020 in the socket would require faster bimmers, etc.

amiga/tech.talk #44: a comment to 43

Wouldn't think just putting in a 68020 would affect DMA. Same clock speed. Or does the '20 do something different cycle-wise?

amiga/tech.talk #45: a comment to 44

The author of message 43 replied that the 68020 at the same clock speed will finish an instruction or series of instructions internal to the CPU in less time and start requesting the bus for some ROM or RAM access. He assumed that the DMA chips hold a higher bus priority, so the result will be that the 68020 will often be sitting there in idle awaiting the BUSACK signal. Waste of a 68020. Perhaps that explains why there is only a 5 percent 68010 edge over the 68000.

amiga/tech.talk #46: a comment to 45

Somebody said that the 68000 only uses every other clock cycle (for memory access, that is). The DMA hardware is fast enough to do four accesses during every clock cycle. Most of the DMA accesses the bus during periods when the 68000 doesn't. If the 68020 doesn't have these quiet periods then there could be problems.

amiga/tech.talk #47: a comment to 46

Actually, there is a counterargument to that, which is that the 68020, but not the 68010, has an instruction-only cache, which would mean...
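The bus-contention argument running through these messages can be sketched with a toy model (purely illustrative; the slot layout and priorities are my simplifying assumptions, not real Amiga timing): chip DMA has priority on every other bus slot, a 68000-class CPU only ever asks for the remaining slots, and a hypothetical faster CPU that is ready on every slot simply ends up waiting for DMA to release the bus:

```python
# Toy model of interleaved bus arbitration (illustrative only, not real
# Amiga timing). DMA has priority on even slots; the CPU can use odd slots.
# A CPU that only needs every other slot never waits; a faster CPU that is
# ready on every slot still only gets the odd ones, so it finishes its
# memory accesses no sooner.

def run_cpu(accesses_needed: int, ready_every_slot: bool) -> int:
    """Return bus slots elapsed until the CPU completes its accesses."""
    slot = done = 0
    while done < accesses_needed:
        dma_owns_slot = slot % 2 == 0              # DMA wins the even slots
        cpu_ready = ready_every_slot or slot % 2 == 1
        if cpu_ready and not dma_owns_slot:
            done += 1                              # CPU gets the bus this slot
        slot += 1
    return slot

print(run_cpu(1000, ready_every_slot=False))  # 2000 slots: fits the gaps
print(run_cpu(1000, ready_every_slot=True))   # 2000 slots: no faster
```

Which is the point made in message #45: at the same clock, a 68020 mostly buys you more time spent waiting on BUSACK.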

Before closing the section, I want to take the opportunity to note Robert Tinney's obituary in Ars Technica. Who is Robert Tinney? The illustrator of many of the covers of the Byte issues we've covered here, who died this February 1st. That his obituary appears in Ars gives an idea both of the magazine's relevance and of the visual impact Tinney's work had on a great many people. Curiously, we are very close to reaching the issues in which the magazine stopped commissioning Tinney and switched to photos on its covers, as you can check in the Byte magazine archives on archive.org, which you can also use, if you like, to skip ahead and see what "next month's" issue is about. I'll add that Tinney had a store, still active (and I hope it stays that way for a long time), and that right now I'm fighting very hard with myself not to buy posters of the 1982 digital arts issue, the April '85 one, or the "keys to education" one from, no less, July 1980.


And we also continue with our review of the March '86 episodes of Computer Chronicles.

The first episode is devoted to trading stocks by computer, something novel at the time. I didn't find it especially interesting, beyond the gadgets for receiving financial information over FM radio, both as a standalone device and as an accessory for your PC.

The second program of the month covers "psychological software", from programs to assist with certain therapies (at the sophistication level of the era, closer to the little game you play to renew your driver's license) to tests of various kinds, with their inevitable "artificial intelligence modules"... and the same worries and the same tangents that sound so familiar today.

(And in the news briefs, word of Commodore's crisis: the company owed the banks two hundred million dollars. It wouldn't actually die until '94, but things were already starting to smell fishy.)

The third program of the month was devoted to astronomy software, both professional and amateur (the latter quite recognizable to anyone who has used an astronomy app, only four orders of magnitude less powerful and with Jurassic interfaces). The discussion of "professional" astronomy... the usual: people marveling at how far the field's technology had come... which now looks almost like a toy to us.

(And in the briefs, the death of the legendary Osborne... fifty-three million dollars in losses at Commodore, in case the two hundred million of debt weren't enough... and Steve Jobs's purchase of Pixar for "several million dollars".)

Episode 3×22, devoted to color, sadly appears to be lost. As usual, you can peek at what's coming in March both in the Wikipedia episode list and in the playlist the YouTube videos above belong to.

And with that we close out the month. More in a few weeks.

On social networks and under-16s (and the tech bros)

...or César wades into a swamp nobody asked him into.

If anything got more attention on the day Pedro Sánchez announced he was joining the push to ban social networks for under-sixteens, it was Pavel Durov's response and Sánchez's counter-reply. So, first things first: nothing could be further from my intention than defending Durov (see "Arrest and indictment of Pavel Durov" on Wikipedia, and in particular the Background section, about the events of August '24, to understand that he is not a figure worth defending) or any other tech bro with a social network and a net worth in the eight figures or beyond. (And of meloncete's opinions, better not even to speak, of course.)

I should also say that I'm not against regulating access to certain services and content by age, and in particular access to social networks (especially the big ones) for people under sixteen. I've read somewhere that maybe what should be banned is access to those networks for white men over a certain age (myself included), and the argument has considerable appeal, but it's still an age-based restriction.

(I could get difficult and hunt for cases where that kind of regulation has negative effects, but that's not the point of the exercise. The only thing I'm sure of in all this is that I wouldn't want, for anything in the world, to have anything to do with the decision to regulate something like this. Not at gunpoint.)

There remains, of course, the question of defining what a social network is. The measure is clearly aimed at Instagram and TikTok in particular. But... if a video game has some kind of mechanism for communication between players (and which one doesn't?)... is it a social network? Because it seems to me that (i) in practice, they are, and (ii) if Instagram and TikTok are harmful to many of the people who use them, the communication channels of many of those video games are at least as harmful. Is email a social network? What about the SMSes on the smart-or-not phones of kids from age twelve up?

If this blog has comments (which work the way they work, I know 🙏) and on occasion (very, very rarely, but it has happened) communication develops between commenters... Between the first parenthesis and the second, you're right: nothing would be lost by removing the comments from obm, true. But... are we going to have to impose age limits on the comments of every blog in the world? I mean, if I run a blog (outside WordPress.com and the like), am I going to have to check the age of anyone who posts a comment? (On obm, I repeat, it's no problem: close the comments and that's that, but...) And the comment sections of online newspapers? (Of those, admittedly, one could argue it would be a win for society if they disappeared, you're right.)

It must also be said that, without having seen the proposed rule and, above all, how it's meant to be implemented in practice, but after having seen some examples of how the age-restriction rules appearing all over the planet are being implemented (above all for adult content or, which is basically the same thing, pornography), those implementations give me doubts, to put it mildly. The Spanish state's capacity for making laws that are lovely in theory but whose practical application leaves much to be desired has, I'd say, been amply demonstrated. And that's with Spain having a mandatory national ID card, which, at least in principle, should make things easier.

So, beyond definitions, the main doubt (or my main doubt, at least) concerns the mechanisms I'll have to use to prove I'm of age in applications like WhatsApp, Telegram, and Signal, or on social networks like Instagram (yes, dear reader, I'm afraid I must confess I have an Instagram account (and Bluesky, and Mastodon and, if you push me, last.fm, which is also a social network)), and how the privacy of that data will be guaranteed. In Europe this is supposedly going to be done through eIDAS 2... but it isn't implemented yet, and won't be until the end of this year (and, by all the gods, may they not rush the development, because everyone's privacy is riding on it)... and before trying out a weapon of potential mass privacy destruction, I'd rather they spent a good few months on live-fire testing before I'm forced to use it.

Goodbye to Mastodon servers run by individuals... What we can say for sure is that checking the age of a service's users is going to be yet another service... and it's going to require resources. No doubt there are companies (big consultancies, for example) rubbing their hands at the chance to offer us those services... for a modest fee. If we have to pay to protect a threatened group, we pay, of course. But I suspect that the virtual communities sustained by the goodwill of their operators... are going to see that goodwill strained even further. And those who can afford it will do so by paying companies we don't like one bit.

So should access to social networks not be regulated, then? Let me repeat myself: I'm not against regulating access to certain services and content by age and, in particular, access to social networks for people under sixteen. But if we're going to do it, or even just applaud the measure, at the very least we could try to find out first how it's going to be done, inventory the potential side effects of doing it (as every regulation has them, as does the absence of regulation) and bring all of that into the discussion. If we have to buy, we buy. But knowing the price.

Byte, January '86

The usual: we continue our project of reading Byte magazine... forty years late, this time with an extra addition at the end. The theme of the month... robotics! (A topic we're going to ignore almost entirely, because it does nothing for me.) But Byte covers are a classic, so here is this month's:

Cover of the January 1986 issue of Byte. The cover theme is robotics. The illustration shows an egg with a robotic arm breaking out of it from inside, like a hatching chick

Let's begin, then, with the editorial:

A Threat to Future Software

Last October Digital Research Inc. yielded to pressure from Apple and agreed to change its GEM software to decrease its resemblance to Apple Macintosh software. (GEM is an operating environment for several MS-DOS- and PC-DOS-based computers that allows a user to interact with a computer via windows and icons rather than the usual text-only commands.) Let's ignore, for the moment, the uncertain worth of a "visual copyright" (the legal term for Apple's copyrighting of the overall "look" of Macintosh software). Let's also ignore the ethics of Apple's actions. The point to focus on, instead, is that Apple's actions are to no one's benefit: Both the microcomputer industry and Apple itself will suffer from their effects.

Apple's actions will slow the growth of the microcomputer industry, which will hurt Apple by shrinking the potential microcomputer audience. Already, several small companies are worried that some project they're working on (and, often, they with it) will be cut down because it is "too Mac-like." In addition, the success of Apple's tactics may encourage other companies to try similar actions, thus increasing the paralysis and anxiety in the industry.

These actions will stifle the incremental evolution that is at the root of any significant growth in our industry. By "incremental evolution" I mean the process of gradual improvement of a product type that eventually leads to a more robust, useful product. For example, Ashton-Tate's Framework did not spring full-blown from the heads of the programming team at Forefront. It had its roots in Dan Bricklin's and Bob Frankston's VisiCalc spreadsheet, Sorcim's SuperCalc (which added functions and sold to a market not supported by VisiCalc), Mitch Kapor's VisiPlot (which gave the distinctive highlighted menu bar now used in so many programs), the software integration of Lotus 1-2-3, and the icons, windows, and pulldown menus of— well, you get the point. If companies are afraid to go to market with what they think are incremental— but distinct— improvements on a basic design, we will become a stagnant industry bounded by the usual and comfortable.

According to Irving Rappaport, Apple's associate general counsel, Apple's intent is to prevent other companies from creating products that are easy to use because of their similarity to the Macintosh. "If people look at it and say, 'Gee, that's like the Mac— I can operate that,' when that's the result you get, it's over the line" of infringement of Apple's copyrights. The effect of this intent is to fragment the industry in the face of what was becoming a de facto standard for human-computer interaction. This lack of standardization will cause many people to stay uninterested in computers because they will have to relearn basic skills with each brand of computer they encounter. (Imagine how many people would drive cars if car manufacturers used different controls for every function in the car.)

Apple might argue that, by claiming a larger slice of a smaller pie, it will still come out ahead. We believe that it will be hurt directly by its actions and will end up with a smaller piece of a pie that is itself smaller. Apple will, in effect, build a wall around its ghetto of Macintosh products, thus limiting its own growth and encouraging people to "live" elsewhere.

Texas Instruments' TI-99/4A provides a good example. TI announced that it intended to directly profit from all software written for its machine by forcing third-party software developers to publish their products through TI. When a brave few brought out 99/4 cartridges on their own, TI added a proprietary chip to their cartridges that the computer required before it would run the enclosed software. Needless to say, the few developers working on 99/4 software wisely turned to support other computers.

The same may happen to Apple. IBM already sells over half the business computers bought today, and IBM PC-compatibles account for a fairly large slice of what's left. If Apple has been slowing the erosion of its market share to IBM with the Macintosh line (and I think it has), its current moves will alienate software and hardware developers, who will begin to lavish their creativity upon the more congenial IBM PC-compatible marketplace. And where innovation goes, the market will follow.

Consider: IBM made its software and hardware architectures open. It allowed the development of innumerable hardware clones, many far more similar to IBM products than GEM is to the Macintosh desktop; consequently, the IBM PC-compatible market far outdistanced its combined competitors in less than two years. On the other hand, Apple is actively discouraging not only copying but also borrowing from its software design. It claims the sole right to benefit from a set of ideas that Apple itself has borrowed and improved on (the most direct borrowing was from work done at Xerox PARC). Given these two opposing directions, what do you think will happen?

A Call to Action

We at BYTE call on Apple to recognize the long-term implications of its actions and limit itself to prosecuting cases where the alleged theft is not of "looks" but of actual program code. Barring that, we call on Apple to license its allegedly copyrightable interface to markets that do not directly compete with its current or planned product line— if the licensing fees are reasonable, everyone will profit.

If neither of these things happen, we call on the judicial system to hand down rulings that reflect a strict interpretation of the visual copyright laws— that is, that a product is at fault only if it shows no distinguishing characteristics in appearance or operation from the alleged original; this would protect products that show incremental evolution. We also call on the industry to do two things. The first is to stand up to Apple and see the case decided on its legal merits. The second is to develop an alternative graphic interface and allow its wide adoption throughout the non-Apple computer community; in this way, the rest of us can get on with the business of making computers— in general— good enough that everyone will want to use them.

[Editor's note: Apple maintains that the agreement covers "only three specific products," but one of them is GEM Desktop, which defines the overall GEM environment. Also, according to Kathleen Dixon of Apple, the agreement includes any custom work DRI has done, including the modified GEM software that Atari uses in its 520ST computer.] —Gregg Williams, Senior Technical Editor

Did you think Apple complained only about Microsoft copying it? (To be fair: over the years Microsoft has copied things from Apple... and there are even cases of Apple copying from Microsoft. And wherever it says Microsoft, you can read Google/Android.) Well, before complaining about Microsoft and Windows, Apple complained about GEM, Digital Research's graphical layer for PC/MS-DOS systems (and not only those: we come back to the topic below). While respecting Apple's intellectual property (more than Byte's editor does, judging by the piece), I share his view that with these things, then as now, it's the consumer who comes out losing.

On to the "Microbytes", the short-news section. This time, on the one hand, a step forward for something we'd already seen here... color comes to flat LCD screens:

Epson, Toshiba Announce Color LCDs

Toshiba has developed an active-matrix, eight-color, 640- by 480-pixel, 10-inch-diagonal liquid-crystal display (LCD) that nearly matches the brightness of a standard color TV. No pricing or availability information was given.

Epson announced a backlit high-contrast, 5.13-inch-diagonal color LCD with a resolution of 480 by 440 pixels (one-third of which are red, green, or blue). Epson says the display's contrast ratio is more than 10 times that of a standard reflective LCD and has a viewing angle greater than 60 degrees. Epson also unveiled a high-contrast, 9-inch-diagonal monochrome LCD with a resolution of 640 by 400 pixels. Samples of both displays will be available during the first half of 1986; prices should be approximately twice as much as standard reflective LCDs.

Epson also announced two 10-inch-diagonal monochrome displays using ferroelectric smectic-C crystals. The 640- by 400-pixel and 640- by 200-pixel displays are said to have high contrast ratios, low power consumption, and moderate cost; samples may be available late this year.

And on the other hand (literally: you have to turn the page to get there), we chip away a bit at the myth that Kodak died for failing to innovate in digital photography:

Kodak Proposes Tiny Magnetic Disk for Photographs

Eastman Kodak, Rochester, NY, has lined up more than 30 companies— including Sony, Hitachi, and Fuji— to support its 47-mm (1.85-inch) floppy disk for storage of electronic still images. The 800K-byte disk can store up to 50 images of 240-line NTSC video. Eventually, the disk is intended for use in cameras; for now, Kodak is working on a 35-mm film-to-disk transfer station for use in developing labs and a still-video player/recorder for the disks.

…and indeed few companies researched and invested in digital photography the way Kodak did, amassing an immense portfolio of patents on the subject. What killed Kodak (quite a few years after 1986) was, above all, the fear of cannibalizing its "chemical" market.

Now, off to the advertising:

Advertisement for the Hayes Smartmodem 2400 modem

Yes, friends, 1986 is the year of flying along at 2400 baud, not the "old" 1200. Almost two and a half kilobits, indeed. Remember the torture of having "only" 4G coverage and downloading things at a few megabits? (But don't get too excited: not every phone line of the era could handle such an outrageous speed.)
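Back-of-the-envelope figures of my own (assuming the usual 10 bits on the wire per byte for the modem's start/stop framing, and a deliberately pessimistic 5-Mbit/s figure for "slow" 4G):

```python
# Rough transfer-time arithmetic: one megabyte at 2400 bps vs. a slow 4G
# link. The 10-bits-per-byte modem framing and the 5-Mbit/s 4G figure are
# illustrative assumptions.

MEGABYTE = 1_000_000

def transfer_seconds(n_bytes: int, bits_per_second: float,
                     bits_per_byte: int = 10) -> float:
    return n_bytes * bits_per_byte / bits_per_second

modem = transfer_seconds(MEGABYTE, 2_400)          # ~4167 s: over an hour
lte = transfer_seconds(MEGABYTE, 5_000_000, 8)     # 1.6 s
print(round(modem), lte)
```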

And we keep looking at ads, with a historic moment: the first Windows advertisement we've seen!

Large text reading "Introducing Power Windows". We see a computer screen with perhaps 8 colors and four windows, which do not overlap but sit side by side. We also see a mouse and a 5¼″ diskette labeled Microsoft Windows.

I won't show you the whole advertorial (it ran 8 pages in all; Microsoft already had some money back then), but I will leave you this marvel of graphics:

Two-page spread with a large screenshot showing up to five windows on screen, again without overlapping. We see the clock application, a window with a primitive file explorer, a "filing assistant", and a bar chart in strict black and white.

Recognize your "lifelong" Windows? Me neither.

We talked about GEM earlier... and we pick it up again here, because this issue examined the Atari ST, the third of the machines with a Motorola 68000 processor, after the Macintosh and the Amiga (always remember: Amiga better than ST better than Mac). And the ST's operating system was indeed Digital Research's GEM (well, as on PCs, GEM was the graphical layer on top of TOS, the real operating system).

The Atari 520ST

The 68000 unbounded

Editor's note: The following is a BYTE product description. It is not a review— for several reasons. Some of the equipment we received, such as the hard-disk drive, was prototype hardware, and at the time of this writing, software is scarce. Atari has not yet completed its BASIC interpreter, and the operating system, TOS, remains unfinished. Nonetheless, we are as intensely interested as our readership in new technology, and we feel we have learned enough to share some of the results of our investigations. We began our work on this description as soon as we were able to get a system from Atari. A full review will follow in a subsequent issue.

For many years the public has equated the Atari name with arcade games and joysticks. In truth, the Atari 400/800/XL computer line is technically at least comparable to, if not better than, other 8-bit machines, so it should not be a surprise that the company's latest venture, the 520ST (see photo 1), is a competitive 68000 system. Indeed, we are most impressed with the clarity of the graphics, with the speed of the disk I/O (input/output), and with the 520ST's value.

The system is not without its problems. The desktop is less effective than the Macintosh's, the keyboard has an awkward feel, and the current operating system makes it impossible to switch between high-resolution monochrome and low- or medium-resolution color without installing the other monitor and rebooting. Nonetheless, we are left with a very favorable impression; several software-development languages are already available, including FORTH, Modula-2, and C. With them, you can tap the power of the 68000 at a most reasonable price.

System Description

The Atari 520ST is a keyboard computer. Like the Commodore 64 and the Atari 400/800, the 520ST keyboard unit contains the microprocessor, the memory, the video and sound circuitry, and so on. The power supply, disk drives, and monitor are external devices. The 520ST has a variety of ports, but there are no internal expansion slots.

The In Brief box on page 90 summarizes the features of the Atari 520ST. For $799, you get the CPU, a 12-inch diagonal monochrome monitor, and one external single-sided double-density floppy-disk drive. For $999, you get the same system with a 12-inch RGB analog monitor in place of the monochrome monitor (see photo 1). Both systems provide 512K bytes of RAM (random-access read/write memory), a Motorola 68000 microprocessor, MIDI ports with a transfer rate of 31,250 bps (bits per second), a DMA (direct memory access) port with a transfer rate of 10 megabits per second for a hard disk or CD-ROM (compact-disk read-only memory), and much, much more. To be sure, owners will make some sacrifices. The unit does not have an RF (radio frequency) modulator for television output, every peripheral has a separate power supply (wire haters beware), and the operating system currently rests in RAM, stealing over 200K bytes from your workspace. We have summarized other problems below, but almost all are insignificant when you consider what you do get for the money. And rest assured, the system works. Our first system, like most of the first production units, had to have several chips reseated. It now functions properly, and we have not heard of any similar quality-control problems on the latest 520STs.

The Hardware Design

The heart of the 520ST is the MC68000, with its 16-bit data bus and 24-bit address bus, running at 8 MHz (see figure 1). The rest of the system was designed to stay out of the 68000's way. (See the 520ST motherboard in photo 2.)

The Atari design team began work on the 520ST in May 1984. From the start, they had several specific goals in mind. The first was to choose a fast microprocessor and do everything to let it run effectively at full speed. To the Atari team, that meant maximizing bus bandwidth and relegating as...

And... shall we compare GEM with Windows as Microsoft itself presented it in its advertising campaign?

Two photos of screens running GEM, one at medium resolution and one at high resolution. The presentation is far more sophisticated than the Windows we saw earlier, with overlapping windows and the system's menus.

(That said: we'll concede that the resemblance to the Macintosh operating system is more than remarkable. It's undeniable.)

We continue with our "we couldn't print this in a magazine today without getting stoned" section, this time with a BASIC program for drawing 3-D surfaces:

EASY 3-D GRAPHICS

BY Henning Mittelbach

A BASIC program for plotting 3-D surfaces

AFTER READING "Budget 3-D Graphics" by Tom Clune (March 1985 BYTE, page 240), I decided to develop a low-cost program for three-dimensional graphics on small computers. 

The program is based upon the formulas for an axonometric projection in relation to the origin, as shown:

XB = X*COS(PHI) - Y*COS(PSI) 
YB = X*SIN(PHI) - Y*SIN(PSI) + Z

Depending on the graphic window of the computer used, you may change these formulas to

XB = XO + X*COS(PHI) - Y*COS(PSI)
YB = YO - X*SIN(PHI) - Y*SIN(PSI) - Z

where XO and YO will represent the origin of the axes, as shown in figure 1. (I developed the program on an Apple II, with XO = 110 and YO = 180.) Also in figure 1, (XB,YB) is the point to be plotted, and PHI and PSI are the angles referring to the horizon. The function Z = F(X,Y), in line 200 of the program, needs a scaling factor F (line 210) that the user has to introduce in the program.

The Program

The program starts at lines 100 to 180 where you set the parameters X0, Y0, ...
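The projection formulas above translate directly into a few lines of Python (a sketch, using the XO = 110, YO = 180 Apple II origin quoted in the article; the function name is mine):

```python
import math

# Axonometric projection of a 3-D point (x, y, z) onto screen coordinates
# (xb, yb), following the article's second pair of formulas: XO/YO place
# the origin, and the y-axis is inverted as on the Apple II screen.

XO, YO = 110, 180

def project(x: float, y: float, z: float, phi: float, psi: float):
    xb = XO + x * math.cos(phi) - y * math.cos(psi)
    yb = YO - x * math.sin(phi) - y * math.sin(psi) - z
    return xb, yb

# With both angles at zero, the x and y axes collapse onto the horizontal
# and z moves the point straight up the (inverted) screen axis.
print(project(10, 0, 5, 0.0, 0.0))  # (120.0, 175.0)
```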

Mind you, the program had a certain sophistication and even removed hidden surfaces:

Plots of the functions sine times cosine, the exponential of the sine of (x times y), and x times y.

(If that weren't enough for you, you can turn to page 397 to see how to implement Euclid's algorithm for computing the greatest common divisor.)
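For completeness, the page-397 routine in question, Euclid's algorithm, fits in a few lines of Python (my transcription, not the magazine's listing):

```python
# Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b);
# when b reaches 0, a holds the greatest common divisor.

def gcd(a: int, b: int) -> int:
    while b:
        a, b = b, a % b
    return a

print(gcd(1986, 84))  # 6
```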

I said I was going to skip the robotics, but I will keep one of the section's articles:

MACHINE VISION

by Phil Dunbar

An examination of what's new in vision hardware

THE POTENTIAL APPLICATIONS of machine vision are many and obvious. Everything from quality assurance to robotic navigation could benefit from the availability of reliable vision systems for computers. Perhaps less obvious, though, is the variety of problems that hamper development of the technology. These problems appear on all levels of machine vision— hardware, low-level analysis, and high-level AI (artificial intelligence) manipulation of low-level data. This article will discuss problems that plague the development of vision-system hardware and indicate some of the technology that has emerged to address these problems.

You might think that the most difficult hardware problem in vision systems is digitizing the high-frequency analog stream of camera data. In fact, that is not so. Currently, machine vision algorithms use gray-scale (i.e., monochrome intensity) video information almost exclusively. Such information can be adequately extracted from an analog signal by a 6-bit or 8-bit A/D (analog to digital) converter. Real-time conversion requires approximately a 10-MHz conversion rate to digitize a 512- by 512-pixel image.

These rates can be achieved with flash converters, pioneered by the TRW company when it introduced the TDC 1007 in 1977. Flash converters employ 2^N - 1 comparators to perform N-bit conversions. That is, an 8-bit flash converter requires 255 comparators to operate. Since all possible digitized values can be compared to the signal at once, the throughput is much greater than with successive approximation methods. Of course, the complexity of the converter rises exponentially with linear increases in resolution. Notable among the commercially available flash converters is TRW's 8-bit monolithic flash converter (TDC 1048), which can operate at speeds necessary for real-time machine vision applications and costs about $140 per unit. The real problems with vision hardware revolve around the cameras. The problems fall into two basic categories: video signal standards and limitations of particular camera hardware technologies.
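The comparator arithmetic above can be sketched in code (an illustrative model of the principle, not of any TRW part): the input is compared against 2^N - 1 evenly spaced reference levels simultaneously, and the output code is simply the number of comparators that trip:

```python
# Model of an N-bit flash A/D converter: 2**bits - 1 comparators, each
# with its own reference tap; the "thermometer" of tripped comparators
# is the output code.

def flash_adc(voltage: float, vref: float, bits: int) -> int:
    levels = 2 ** bits - 1                      # 255 comparators for 8 bits
    thresholds = [vref * (i + 1) / (levels + 1) for i in range(levels)]
    return sum(1 for t in thresholds if voltage >= t)

print(flash_adc(2.5, 5.0, 8))  # mid-scale input -> code 128
```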

Television Standards

Much of robotics suffers from a lack of standards. Machine vision, on the other hand, suffers from the existence of video signal standards that are not appropriate for our needs. Those standards were created by and for the television industry. Since the entertainment industry is still a far more lucrative market for camera manufacturers than machine vision, few image sensors and cameras deviate from television standards.

The monochrome video signal standard used in the United States, Japan, and most of the Western Hemisphere is RS-170, a subset of the NTSC (National Television Systems Committee) standard. Europe uses the international CCIR (Consultative Committee, International Radio) standard, which is similar to, but not compatible with, RS-170. Since both standards present essentially the same problems to machine vision applications, I will limit my remarks to the RS-170 standard.

The RS-170 standard defines the composite video and synchronizing signal that your television uses (see figure 1). The image is transmitted one line at a time from top to bottom of...

And after vision came a piece on tactile sensors, another on autonomous navigation, and one on AI in computer vision. Once again, you can't tell whether it's '86 or '26 (and I don't feel up to explaining to the authors that the thing still had a few decades to go).

And we take one last look at the ads, because I don't think we had ever featured IBM's wonderful "Charlot" (Little Tramp) campaign here:

Two-page ad. On the left we read that the PC has taken performance to new heights. On the right, the Little Tramp sits atop a mile-high stack of documents of every kind, working on an IBM PC.

And it wasn't a one-off ad, I assure you: the campaign ran for years, always visually wonderful. Here's a compilation of the TV spots.

And we'll leave with another historic moment:

The Acorn RISC Machine

A commercial RISC processor
by Dick Pountain

Acorn Computers Ltd. is one of the U.K.'s most successful computer companies, but like many others, it had its share of financial problems during the depressed year of 1985. Set up in 1979 by two Sinclair alumni, Chris Curry and Hermann Hauser, the Cambridge-based firm (4a Market Hill, Cambridge CB2 3NJ, England) started out manufacturing a set of modular single-board controllers based on the MOS Technology 6502 processor. These small boards stacked together to make up complete industrial-control systems. The following year the Acorn people launched the Atom personal computer, a packaged but expandable machine that arose out of their experience with 6502 systems. For a while, at around £200, the Atom was the cheapest hobby computer available here, and it attracted a strong following, particularly among those who are as handy with the soldering iron as with the assembler. Hopped-up Atoms can still be found to this day.

Acorn's next product, initially called the Proton, was designed to meet a very advanced—for the time— specification published by the BBC (British Broadcasting Corporation), which was requesting bids to supply a personal computer around which an educational television series would be produced. Acorn won the contract, after a strong and often acrimonious contest in which Sinclair Research, whose 48K-byte color Spectrum was already on the market, lost out.

After a frustratingly long delay due to quality-control problems with the ULAs (uncommitted logic arrays), the BBC computer was launched and proceeded to corner the market in schools and universities. Acorn became a very wealthy company, with a turnover reputed to be £100,000,000 per annum at its high point.

The BBC Micro (alias the Beeb) is still quite a deluxe machine, with better high-resolution color graphics than any of its competitors, and quite a bit faster, thanks to its 2-megahertz 6502. Another plus is the provision of a 10-MHz bus, called the Tube, to which second processors can be attached. Acorn charges a lot of money for this sophistication though, and the Beeb has kept its £400 price long after competitors have slashed theirs to below the £200 mark.

Acorn had from the start paid more attention to software than most manufacturers, recruiting the brightest Cambridge University computer science graduates for its software division. As a result, the Beeb acquired a range of languages unrivaled by any machine but the Apple II, including an advanced structured BASIC, LISP, Logo, FORTH, Pascal, BCPL (Basic Combined Programming Language), and more. But despite all these positive points, the Beeb has a major drawback, a shortage of memory. The ambitious specification, combined with the limited addressing capabilities of the 6502, left it with a maximum of 32K bytes of workspace (only this year upgraded to 64K bytes), and in the higher-resolution graphics modes this can be reduced to a mere 8K bytes. That doesn't get you very far in LISP or Logo.

So at the height of its prosperity Acorn set a team to design, in secret, its own processor to replace the 6502. This may seem like an ambitious, even rash, undertaking, but the people on the Acorn team were so wedded to the simplicity and speed of the 6502 architecture that they found it hard to countenance any of the commercially available 16-bit replacements. The BBC operating system is heavily interrupt-driven, and the sluggish interrupt latency of 16-bit chips, such as the Intel 8086 and Motorola 68000, would have meant introducing DMA (direct memory access) hardware and all sorts of other undesirable complications. Acorn did, in fact, adopt the National Semiconductor 32016 as a second processor for the Beeb, but only after first offering a 3-MHz 6502. And so they conceived the idea for the...

Acorn RISC Machine… A, R, M. The architecture of the chip in your phone. Or in your Mac, if you have one. And there you are, watching its birth, live. No small thing.

And that's it for this month's Byte. If you want to do your homework for next month, as always, here are the Byte magazine archives on archive.org.


And that would have been all… but the other day I learned of the death of Stewart Cheifet (even the New York Times ran an obituary). Who is Stewart Cheifet, you ask? Don't tell me you've never seen his Computer Chronicles. If Byte is, at least for me, one of the essential print resources for revisiting the history of computing, Computer Chronicles is the same thing in video form. The archives of the PBS show (PBS being the US public broadcaster, sadly now in mortal danger thanks to the Trump administration and its allergy to quality information) are an essential document if you're interested in the 1983-2000 period. And as a tribute, and since these Byte posts <irony>aren't long enough already</irony>, I thought that rounding them out by watching the corresponding episodes would be, at the very least, a curious exercise1. So here are the January '86 episodes…

On January 7 the show kicked off with… artificial intelligence!

(Didn't you love the Byte sponsorship spot? 😅)

We can't fail to mention Cheifet's co-host: none other than the ill-fated Gary Kildall, creator of CP/M… and of GEM. There are multiple universes parallel to ours in which we love and hate Kildall, CP/M, and GEM, and don't remember who Bill Gates was or know anything about an operating system called Windows.

The Jerrold Kaplan who appears in the first interview, by the way, was working at the time with Mitch Kapor; in 1987 he founded Go, devoted to what would later be called PDAs, and he would go on to found the first auction website (five months before eBay). Not bad. We can also highlight the presence of the philosopher Hubert Dreyfus, strongly doubting the expertise of the expert systems of the day :-).

Also wonderful: the experts suggested that 1986 could be the year of speech recognition 😅.

Then, on the 14th, another topic nobody talks about these days: computer security.

…although at the time this meant using computers to fight crime: wrestling with fingerprint catalogs or using geographic information systems, for example, but also digitizing processes like any other organization.

I would recommend, though, jumping to minute 27:30 of the video, where Cheifet talks about the graphics of the film Young Sherlock Holmes (El Secreto de la Pirámide in Spain)… created by a "new graphics computer, created by Industrial Light & Magic, a division of LucasFilm. The computer is called… Pixar".

And I'll stop there because, according to this episode list on Wikipedia, the next one wouldn't air until February.

Right then, more next month (we'll decide whether it's just Byte or Byte plus Computer Chronicles).

  1. A curious exercise that, inevitably, I wasn't the only one to think of: I see that someone has set up computerchronicles.blog and has already revisited no fewer than the first 133 episodes. ↩︎

Byte, December '85

Time to close out the year with our usual reread of Byte magazine… from forty years ago. This time, hot topics… of 2020.

Cover of the December 1985 issue of Byte. The theme is "computer conferencing", with an illustration of a D-type connector with a flat ribbon cable… except the connector's pins are people sitting at a table.

(You have to admit that using a D-type connector like that is, at the very least, creative :-).)

And we start with an ad, not so much for the product itself… but because by 1985 it had already been a few years (not many, admittedly) since Bill Gates had said (or not) that 640K ought to be enough… and we already knew it wasn't.

Ad for a two-megabyte memory expansion, with the slogan "for those occasions when 640K just doesn't seem to be enough".

By the way: forty years later, your computer has… four thousand times as much memory? Eight thousand?

In the "things that aren't new" department…

And Now, Adware

In response to the letter by Mr. Tate ("Don't Sell Software. Sell Ad Space," August, page 26) regarding the selling of advertising space in entertainment software: Wow! What a great idea. Adware (that's my term for it) could resurrect the failing home computer industry.

Let's face it, most home computers are used for entertainment; however, the general public is not usually willing to spend $30 to $100 for a game. In general I feel that this attitude applies to all types of home entertainment. Look at how successful television has become simply because you don't have to pay for it to enjoy it (unless you want cable or pay TV, but even that is relatively inexpensive). With Adware you would still have to incur the cost of downloading from the telephone. This same reason also accounts for the to-date unsuccessful home videotex systems.

Mr. Tate mentions the advantages of Adware but fails to mention the virtues of the Freeware concept and what Adware could bring to it. I personally do not agree with the idea of selling copy-protected entertainment software commercially. Computers are very good at copying software, and so this fact should be put to good use. Freeware (the free distribution of software by encouraging copying) offers the users a better and more dynamic product. For example, I have a Freeware product that I continue to update as improvements and additions are implemented. When a new release is ready I simply make it available on the Freeware market. You cannot do this economically with a similar commercial product without covering your expenses by raising the retail price. With Adware you could make it a policy to release a new version every few months to insure a dynamic advertising medium.

At present the Freeware distribution network is not firmly established, but if the amount of Freeware and the demand for it grew large enough I am sure that regular channels would establish themselves quickly so that everyone could have almost immediate access to the updates. Another benefit of this concept would be...

…because the debate over software funding models turns out to be as old as software itself.

We're never going to skip technologies that improve computer accessibility:

The Audiodata/IBM PC Keyboard from Frank Audiodata GmbH of West Germany uses tone and speech capabilities to make the IBM PC accessible to blind and visually impaired users. The system generates different tones depending on the type of data at the cursor's screen location. To position the cursor, you use sliding switches that correspond to the horizontal and vertical axes.

The vertical switch is on the left-hand side of the Audiodata keyboard, next to the function keys. Moving it from top to bottom yields a series of tones that tells you whether lines are blank or full of text. The horizontal switch is below the space bar. Moving it left and right yields tones that indicate letters, spaces, numbers, and punctuation marks in a line. By moving the switches and listening to the resulting tones, you can tell how many characters of what type are at what position on the screen.

The keyboard contains a Votrax SC-01 speech processor, so you can literally have the system read a portion of text out loud. Pressing a button on the vertical switch tells the system to read the line of text that corresponds to its position. Using the vertical and the horizontal switches together, you can have the system read or spell particular words.

The Audiodata keyboard works with standard or large-print monitors or with no monitor at all. It comes with a 6-inch add-in card and the system software for $3450.

We will, however, be scandalized by the 3450 dollars (of the day) that the gadget cost. Much cheaper was this proto-touchpad (which, rather than a touchpad, was really a macro keyboard):

Touchpad Accessory for the IBM PC

Polytel Computer Products has introduced the Keyport 60, a small rectangular touchpad that fits along the top of the IBM PC keyboard. It has 60 touch-sensitive regions that can be programmed as function keys and defined in regular and shift modes, so the Keyport 60 will accommodate a maximum of 120 macro commands.

To record a macro, you press the Alt key on your regular keyboard and a touchpad key simultaneously. Any keystrokes that follow are recorded until you press the Alt and touchpad keys a second time.

The touchpad package comes with KPEDIT, a full-screen editor that allows you to edit key definitions.

Keyport 60 works with the IBM PC, XT, AT, and compatible personal computers, using the joystick adapter to allow concurrent operation with your regular keyboard. It costs $399.
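The record/playback scheme the blurb describes (Alt plus a pad key starts recording, everything typed next becomes that key's macro, Alt plus the key again stops) maps naturally onto a little state machine. A hypothetical toy model, not Polytel's actual firmware:

```python
class MacroPad:
    """Toy model of Keyport-style macro recording (illustrative only)."""

    def __init__(self):
        self.macros = {}       # pad key -> list of recorded keystrokes
        self.recording = None  # pad key currently being recorded, if any

    def alt_plus(self, pad_key):
        # The first Alt+key starts recording into that key;
        # the second Alt+key stops it.
        if self.recording is None:
            self.recording = pad_key
            self.macros[pad_key] = []
        else:
            self.recording = None

    def keystroke(self, key):
        # While recording, keystrokes are captured instead of (or as well as)
        # going to the application.
        if self.recording is not None:
            self.macros[self.recording].append(key)

    def play(self, pad_key):
        return self.macros.get(pad_key, [])

pad = MacroPad()
pad.alt_plus("F1")          # start recording on pad key F1
for k in "dir\r":
    pad.keystroke(k)
pad.alt_plus("F1")          # stop recording
print(pad.play("F1"))       # ['d', 'i', 'r', '\r']
```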

Here's something I'm still surprised they hadn't added to the magazine earlier. Yours truly spent a non-negligible part of the eighties typing in code printed in various magazines. With my nonexistent income at the time, time well spent. But the buyer of a magazine like Byte most likely had the financial resources not to see it that way:

Page announcing a new magazine service: the sale of diskettes containing the code printed in the magazine. In 5¼-inch format you can choose among diskettes for the IBM PC, the Apple II in two formats, the Commodore 64, the Hewlett-Packard 125, the Kaypro 2 (CP/M), two TRS-80 models, the Texas Instruments Professional, the Zenith Z-100, and Atari. In 3½-inch format, the options are Apple Macintosh, Atari ST, Commodore Amiga, Hewlett-Packard 150, and Data General/One.

Prices range from five dollars for a 5¼-inch disk in the United States to eight for 3½- or 8-inch disks shipped to Asia, Africa, and Australia.

Bytecom is also announced: "computer conferencing among Byte readers in Europe", plus a handful of BBSs outside the United States.

And, as a bonus, we get a ranking of the most popular computers among the magazine's readers at the time. Note, first, that 8-inch disks in the standard CP/M format are there, and second, that the prices, considering the operation must have been fairly manual, strike me as quite reasonable.

I'm not including a screenshot because the scan of the corresponding pages on the Archive is bad (you can navigate to the corresponding page of Byte – a visual archive), but the issue's book section is worth a look. It opens with a review of Back to BASIC: The History, Corruption, and Future of the Language (available on Amazon, curiously), in which BASIC's designers apparently complain bitterly about the thing their language had become, twenty years after its creation, and explain why they were launching True BASIC in 1985 (that the domain still exists, and that you can buy current and vintage versions of the language there, plus manuals, blew my mind). The authors explain that BASIC was designed to be a compiled language, not an interpreted one, and that this, combined with the RAM constraints of the computers it was being used on in the eighties, had wrecked its design. Go figure.

Later comes a review of Ethical Issues in the Use of Computers. Again, the Archive's digitization is bad, but you can get to the visual archive. A reminder that this is hardly a 21st-century topic: forty years ago we were already worried about the privacy risks of large databases, the digital divide, and algorithmic decision-making about our health.

Back to the diskette service with the magazine's code, and in our usual "things that would never make it into a magazine today" department…

A SIMPL COMPILER, PART 1: THE BASICS

by Jonathan Amsterdam

An implementation of a compiler for a simple structured language

In this article— the first of a three-part series on the construction of a compiler for a high-level language— I will discuss the basics of the compiler. Next month I will talk about procedures and functions, and in the third part of the series I will describe some of the compiler's extensions.

Three of my earlier Programming Projects are prerequisites for this one. "Context-Free Parsing of Arithmetic Expressions" (August, page 138) explains the parsing technique I will be using. "Building a Computer in Software" (October, page 112) describes VM2, the virtual machine for which my compiler is targeted. And "A VM2 Assembler" (November, page 112) details the assembly-language code that the compiler will generate.

The SIMPL Programming Language

I will be describing a compiler for a language of my own design, called SIMPL. SIMPL, which stands for "SIMPL Isn't Much of a Programming Language," isn't much of a programming language. SIMPL's grammar is given in figure 1. There are a few points that are not described by the grammar. An identifier is any string of letters and numbers beginning with a letter. Unlike most implementations of Pascal, SIMPL is case-sensitive, so the identifiers READ and Read mean different things. SIMPL keywords, like PROGRAM and BEGIN, are capitalized. Comments in SIMPL are delimited by braces ({ }). As in Pascal, character constants are delimited by single quotes, but SIMPL also allows the backslash character (\) to act as an escape. When followed by an n or a t, the backslash denotes a new line (carriage return) or tab; when followed by any other character, it denotes that character. For example, the character constant for the single quote looks like '\''.

SIMPL's WHILE and IF statements, like those of Modula-2, are explicitly terminated by an END. The AND operator has the same precedence as OR, and both have weaker precedences than those of all other operators, so it is unnecessary to put parentheses around expressions connected by AND and OR. Furthermore, expressions surrounding an AND or OR will be evaluated from left to right, and no more than necessary will be evaluated. For example, in the expression TRUE AND FALSE AND TRUE, the first TRUE will be evaluated and then the FALSE will be...
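The backslash rule described above (\n and \t become newline and tab, any other escaped character stands for itself) takes only a few lines to implement. A sketch of how a SIMPL scanner might decode the body of a character constant (my own illustration, not Amsterdam's code):

```python
def decode_char_constant(body: str) -> str:
    """Decode the inside of a SIMPL character constant per the article's rule:
    \n and \t map to newline and tab; any other escaped character maps to
    itself; an unescaped character is taken literally."""
    if body.startswith("\\") and len(body) == 2:
        return {"n": "\n", "t": "\t"}.get(body[1], body[1])
    return body

assert decode_char_constant("a") == "a"      # plain character
assert decode_char_constant("\\n") == "\n"   # escaped n -> newline
assert decode_char_constant("\\t") == "\t"   # escaped t -> tab
assert decode_char_constant("\\'") == "'"    # the single-quote example
```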

And we can't close the section without including voice interfaces. If anyone out there can travel back in time, please have the tact not to tell the author that the thing still had a few decades to go.

English Recognition

The ultimate in user-friendliness

Plain English is hardly ever used to communicate with a computer. This is unfortunate because it can be very effective, and programs that recognize and use relatively complex English sentences have been written for microcomputers. English gives you a variety of ways to express complex actions with a minimum of training and program interaction. Menus, on the other hand, are often highly complex and cumbersome— both for the user and the programmer. Special languages are difficult to learn and to design and implement correctly.

Some applications seem to demand a natural-language controlling mechanism (for example, database programs and games). When you design these kinds of programs, it is hard to predict the questions or commands a user might enter. Even in the largest and most expensive custom database systems, there always seem to be questions outside the scope of the programming.

However, English has been used successfully to control database programs. The first public success was LUNAR (Lunar Sciences Natural-Language Information System), which allowed scientists to query a large file of physical and chemical data on the lunar rock samples brought back by the Apollo 17 mission in December 1972. More recently, Larry Harris of Artificial Intelligence Corporation has been successfully selling a database retrieval system called ROBOT (now INTELLECT) that uses natural English. It runs on IBM machines and licenses for tens of thousands of dollars. R:base CLOUT by MicroRIM, another English-based database-retrieval system, runs on microcomputers, but it's not cheap either. Several game designers have recognized the benefits of using English to communicate with computers and have tried to use it as their controlling mechanism. However, the approach they take seems a bit limited. The games often have trouble recognizing what should be valid directions or questions by the players.

Now for the cover story:

AN OVERVIEW OF CONFERENCING SYSTEMS

by Brock N. Meeks

A guided tour through COM, EIES, PARTI, NOTEPAD, and other systems

NUOSO LIVES on the African continent. Exactly where he lives and the name of his tribe is not important; Nuoso is a nonperson. Convicted of a crime against his tribal society, he is forbidden to communicate with his family, his friends, in short, with anyone. His communication cut off, Nuoso quickly withdraws from the village. Eventually he will cease to exist even in his own mind, and he will literally die from lack of communication.

Just as people need food, water, and shelter, so they need to communicate. From the earliest days of history, our ancestors sought better ways to communicate. Primitive maps scrawled in the dust gave way to cave paintings, where information retrieval entailed nothing more complicated than remembering the right wall in the right cave. But just as society became more complex, so did the communication needs of the population.

Early telegraph links, in Napoleon's time, had signal speeds of about two characters per second. In 1913 vacuum-tube repeaters were introduced to telephony, and a rapid succession of advancements in the world of electronics followed. In 1918 the first carrier system permitted several voice channels to occupy a single pair of wires. The early 1940s saw high-capacity coaxial cables beginning to replace twisted-pair cables. Microwave links emerged in 1946 with the capacity to carry more than 10,000 telephone channels. Today's phone system uses satellite links and will soon use fiber optics. In a hundred years our communication capability has risen from fifteen to a billion bits per second, from two to over a hundred million characters per second. And all for the sake of improving communication with each other.

The Birth of Computer Conferencing

Early in 1970, political and economic pressures set the stage for the creation of a revolutionary means of communication. In the fall of 1971 the entire economic structure of the United States fell under the control of President Nixon's wage-price freeze. Because of the tremendous need to handle the reporting and information dissemination of the price freeze, the Office of Emergency Preparedness (OEP) commissioned Murray Turoff to create a computerized version of the "conference call." Turoff responded by developing the Emergency Management Information System and Reference Index (EMISARI).

The EMISARI system operated as an electronic network linking the ten OEP regional offices. The new price controls created a nationwide demand for information, guidelines, rulings, official statements, and policy clarifications from businesses, labor unions, and administrators. Because EMISARI eliminated the constraints of time and geographic location, the OEP's regional administrators were able to secure time-critical information at their convenience. The instant access of EMISARI allowed Washington to update policy as it happened and gave all involved the opportunity to respond or ask questions— with both...

Yes, dear reader, it was obvious that we couldn't be talking about the Zooms of the eighties, but rather about the asynchronous Discords (at a stretch; what's described is really more like Reddit) that the computers and telecom networks of the day could support.

Worth noting: (i) the word "Internet" doesn't appear in the piece, and (ii) you can go to page 174 for a sidebar on the effects these systems were going to have on people and societies. If I'm not mistaken, by the way, its author was, at the time, busy founding the legendary The Well.

Back to the ads. How old did you think Logitech and its mice were?

Ad for the Logitech LogiMouse C7 mouse.

OK, quite a few of you guessed at least forty. But… how many of you knew Logitech was once in the compiler business?

Ad for Logitech's Modula-2/86.

(According to Wikipedia, Logitech was founded in Switzerland in '81 with the goal of building a word processor, and from there they moved on to "rodents". By 1984 they already had wireless (infrared) mice. The C7 above was their first retail mouse. You too would have paid over two hundred euros (adjusting for inflation) to have one, right? The Modula-2 thing, it seems, was just a passing fancy.)

I won't leave without a quick look at the computers of the day:

Two laptops of the era and their specifications.

Both weigh 4.5 pounds (about two kilos).

The Tandy 200 has an 8-bit 80C85 processor at 2.4 MHz, while the NEC PC-8401A has a Z80-compatible processor at 4 MHz. The Tandy has 24K of RAM, expandable to 72; the NEC comes with 64, expandable to 96.

They have LCD screens: 40 characters by 16 lines on the Tandy, 80 by 16 on the NEC.

Both have 300-baud modems. It's noted that both run on batteries.

The Tandy has its own operating system; the NEC runs CP/M 2.2. Both come with software suites that include a word processor and spreadsheet.

Did you notice the mention that both can run on their own batteries? (They note that the NEC, on four C batteries (who said "rechargeable battery"?), lasted a couple of hours.) And that no internal storage of any kind is mentioned? I won't bother calculating what the thousand dollars they cost would be today, but they do note that a functional NEC setup runs to more than two thousand… And I'll also note that the piece closes with a "why a laptop, anyway?" which, given the technology of the day, was a more than reasonable question. Oh, the wonderful eighties.

And we close with a software classic:

The Norton Utilities

Tools for recovering data and directories

Peter Norton's data-recovery tool really recovers lost data. I've used it successfully dozens of times. Will it save every lost file? No. Unfortunately, there are some kinds of damage that the Norton Utilities can't repair. Can you tell before buying the program whether it will help you recover a particular file? The answer to this question has to be inconclusive. There are different kinds of lost data, and sometimes, even when you know how the damage occurred, it is difficult to predict whether it can be repaired.

The simplest kind of loss occurs when you delete a file by using the ERASE or DEL commands in DOS. Even though your directory indicates that the file no longer exists, it hasn't really been erased. What's happened is that an instruction prohibiting DOS from writing in certain areas of the disk has been altered. Your data is retained until information is actually written into these sectors. If you change your mind and decide that you need the discarded data after all, the Norton Utilities will reverse the changes made by the ERASE command and your old file will be restored.
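The mechanism the review describes, where ERASE only drops the bookkeeping and leaves the data sectors untouched until they are reused, can be illustrated with a toy model. (In real FAT-based DOS, deletion overwrites the first byte of the directory entry with the marker 0xE5.) This sketch is mine, not Norton's code:

```python
# Toy model of DOS-style deletion: a "disk" of sectors plus a directory.
disk = {0: b"hello ", 1: b"world"}      # sector number -> contents
directory = {"LETTER.TXT": [0, 1]}      # file name -> sector list
deleted = {}                            # "hidden" entries, like 0xE5-marked ones

def erase(name):
    # Like DOS ERASE: remove the directory entry, leave the sectors alone.
    deleted[name] = directory.pop(name)

def unerase(name):
    # Recovery works only while no new file has overwritten the sectors.
    sectors = deleted.pop(name)
    if all(s in disk for s in sectors):
        directory[name] = sectors
        return b"".join(disk[s] for s in sectors)
    return None

erase("LETTER.TXT")
assert "LETTER.TXT" not in directory    # the file "no longer exists"...
print(unerase("LETTER.TXT"))            # ...but the data was still there: b'hello world'
```

The same model also shows why you shouldn't write to a disk with a lost file: one new file landing on sectors 0 or 1 and the recovery fails.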

Certain types of equipment failures can produce more serious data losses. Every DOS-formatted disk contains hidden files called the boot record and the file allocation table; these, together with the directory, are used for managing the data stored on the rest of the disk. If garbled information is entered into these files— a common cause is a disk drive out of alignment— your data may become inaccessible. But sometimes the data files themselves may survive this damage; if so, you may be able to recover some or all of them.

Retrieving Lost Data

If you plan to use the Norton file-recovery procedure, you should be careful not to write on a disk with a lost file. You'll risk having new data entered in the sectors containing the file you hope to save. Once the old information has been overwritten in this way, it can't be recovered.

If you did write on the disk, there's still a chance that the sectors holding the erased file were not the ones that received the new data; this depends on factors like how much free space was on your disk and which version of DOS you're using. So until you actually begin the recovery procedure, you won't know for sure whether the lost file was destroyed. Still, it's best not to take chances. Make it a rule to never write on a disk containing damaged files.

Another good idea is to make a copy of your damaged file whenever possible. If you're working with floppy disks, you should use the DOS DISKCOPY command (as opposed to the COPY *.* command); DISKCOPY works by reproducing what's on the source disk exactly, byte by byte, so even deleted data is copied. Carrying out the recovery procedure on a copied version of the damaged file means that if you make a mistake, you'll have a chance to recopy the original and try again.

After taking these precautions, you can begin the file-recovery procedures. In general, for simple problems like unintentionally invoking an ERASE command, you can expect the Norton Utilities to retrieve lost files consistently. When a problem is caused by a current spike, static electricity, or a disk drive out of alignment, it's harder to predict how much of a file can be recovered; this is because so many different varieties of error can occur.

With many types of errors, the Norton Utilities can often help you salvage something. At times you may be able to recover only portions of a file. This is similar to what happens when the CHKDSK procedure in DOS restores only some of the lost clusters (groups of isolated sectors) in a file. In many cases you can save enough of a file to be able to reconstruct the missing portions with little trouble. But for some files, such as those created with spreadsheet programs, even a small amount of data loss can

The Norton Utilities, from the celebrated (and now octogenarian) Peter Norton, had been on the market since '82 (and, perhaps more surprisingly, new versions would keep shipping until 2012).

As usual, you have the Byte magazine archives at archive.org, and if you want, you can get a head start on the January '86 issue! See you next year :-).

Byte, November '85

I'm late! It's December and the Byte cover says it's still November (of 1985, of course). Anyway, here we go, in a hurry, with this month's look at the magazine.

Cover of the November 1985 issue of Byte. The price is three dollars and ninety-five cents. The cover story is Inside the IBM PCs. The black-and-white illustration shows a human figure (it looks like a man in a suit, briefcase in hand) in front of a huge PC-style computer of the era that seems to come apart into a sort of three-dimensional puzzle.

And we start with what is still, in 2025, a staple of computer magazines, blog posts, and YouTube videos galore: public-domain utilities:

Public-Domain Utilities

Build an extensive software library for free

by Jon R. Edwards

THE EXTENSIVE public domain collection for the IBM Personal Computer and compatibles is a very valuable resource. It is easily possible to build an extensive software library and incorporate the utilities into your home projects or to save considerable time and effort by installing a RAM (random-access read/write memory)-disk and print spooler. Most programs in the public domain provide source code; you can learn from the code and, more important, you can customize the routines for your own requirements. Undoubtedly, some of the software will fill your needs, and the more obscure programs may simply trigger your imagination.

The notion that "free means shoddy" does not necessarily apply to this software. I suspect that most of the free utilities were originally written to fill individual needs and as part of the "hacker ethic" have been shared with the public. The programs adequately fill many needs, and they have a tendency, as the user community modifies and expands them, to become more and more bug-free and sophisticated. Most public-domain programs provide limited functionality, and their user interfaces and documentation are generally less polished than commercial products, but it is amazing how many commercial products do very little more than integrate the capabilities of programs that already exist in the public domain. If nothing else, exposure to these programs will make you more aware of what to look for and expect from the products you buy. And who knows —in the short descriptions that follow, you may find software that's perfectly suited to your needs. At least the price is right.

Free Software

To the best of my ability, I have concentrated on free, no-strings-attached software and not on shareware or user-supported software. There is, to be sure, a growing amount of shareware for the IBM family, and much of it is excellent (see "Public-Domain Gems" by John Markoff and Ezra Shapiro, March BYTE, page 207), but the products often do not provide source code, and their authors usually request a contribution; most users legitimately feel that the products deserve financial support.

Naturally, I cannot guarantee that the software you download will function as you hope it will. I certainly hope you find dozens of interesting utilities here and that your investigations lead you to new and exciting things, but I take no responsibility if the programs you download do nothing or turn your screen inside out.

Locating free software is getting easier and easier. There are more users groups, bulletin-board systems (BBSs), and public-domain copying services than ever before, and the...

Forty years on, we're just as mad about getting free utilities, and we still have to explain that "free" doesn't necessarily mean "bad". It is curious, though, to see that in 1985 it had to be pointed out that many utilities came with their source code ("open source" became fashionable in the late nineties, says Wikipedia). And one breaks into a cold sweat at the thought of downloading software from BBSs over the modems of the day (even if the programs back then weighed next to nothing compared with today's).

If you click through to the page and keep reading you'll find disk utilities, memory utilities, system-status tools, keyboard helpers, text- and file-manipulation tools, screen-control utilities, small applications, printing utilities, communications software, and programming languages (Forth, LISP, Logo). The usual: we have changed over forty years, but not as much as you might imagine.

I think it had been a while since we looked at the ads:

Ten megabytes in 8 minutes is a bit over 20 kilobytes per second (my fiber connection easily does 50 megabytes per second, which is well over 20 gigabytes in 8 minutes, and USB 3 ports reach 500 megabytes per second), for barely 180 dollars of the day (460 euros today). And you complain that your USB stick is slow and expensive… And while we're at it, we can review the speeds of the era's disks in general:

Factors Affecting Disk Performance

Four major physical factors determine overall disk performance: access time, cylinder size, transfer rate, and average latency.

Access time is the amount of time it takes to move the read/write heads over the desired tracks (cylinders). Once the heads are over the desired tracks, they must settle down from the moving height to the read/write height. This is called the settling time and is normally included in the access time. Specifications for AT and XT disk-drive options are shown in table A.

A cylinder is composed of all tracks that are under the read/write heads at one time. Thus, tracks per cylinder is the same as the number of data heads in the drive. Cylinder size is defined as tracks/cylinder x sectors/track x bytes/sector.

The Quantum Q540, for example, has four platters and eight data heads, while the Vertex V170 has four platters, seven data heads, and one servo head. The difference is that the Quantum drive uses an embedded (or wedge) servo, where the servo signal is embedded on the data tracks, preceding the data portion of each sector on the disk. The Vertex drive uses a dedicated servo that requires its own surface. This difference means that the Quantum drive has 8.5K bytes more data available to it before it must seek the next track; if all other factors were equal (which they aren't), the Quantum would be slightly faster in those cases that required reading that "extra" 8.5K bytes.

Transfer rate is the rate at which data comes off the disk. It depends on rotation rate, bit density, and sector interleaving. The first two factors are practically the same for all AT-compatible 5 1/4-inch hard disks, but not for all floppy disks (the ATs spin 20 percent faster than the other PC floppies).

Sector interleaving is used to cut down the effective transfer rate. The interleave factor of 6 used on the XT cuts the effective transfer rate from 5 megabits per second to 0.833 megabit per second. Note that embedded servo disks, such as those used in the XT and the AT, actually spin about 1 percent slower than 3600 revolutions per minute (rpm) to allow for the increased density due to the servo.

Average latency is the time required for a disk to spin one-half of a revolution. For hard disks, which spin at 3600 rpm, the average latency is 8.33 ms (1/3600 rpm x 60 seconds/minute x 0.5 = 8.33 ms per half revolution). This is due to the fact that after the heads finish seeking and settling, you must wait for the required sector to come under the heads.
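The sidebar's three formulas are straightforward to reproduce. A quick sketch using its own numbers (the 17-sectors-per-track and 512-bytes-per-sector figures are the common MFM format of the era, assumed here rather than stated in the text):

```python
# Reproduces the sidebar's calculations. Interleave and rpm figures
# come from the article; 17 sectors/track x 512 bytes/sector is the
# era's usual MFM format, assumed for the cylinder example.

def cylinder_size(tracks_per_cyl: int, sectors: int, bytes_per_sector: int) -> int:
    """Cylinder size = tracks/cylinder x sectors/track x bytes/sector."""
    return tracks_per_cyl * sectors * bytes_per_sector

def effective_rate_mbit(raw_mbit: float, interleave: int) -> float:
    """Interleaving divides the raw transfer rate by the interleave factor."""
    return raw_mbit / interleave

def avg_latency_ms(rpm: float) -> float:
    """Average rotational latency: half a revolution, in milliseconds."""
    return 0.5 * 60_000 / rpm

print(avg_latency_ms(3600))           # 8.33... ms, as in the sidebar
print(effective_rate_mbit(5, 6))      # 0.833... Mbit/s on the XT

# One extra track per cylinder (8 data heads vs. 7) is the Quantum's
# "extra" data before the next seek: 1 x 17 x 512 = 8704 bytes = 8.5K.
print(cylinder_size(1, 17, 512))      # 8704
```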

The fastest thing around at the time? 300 kilobytes per second. And I don't even feel old remembering it… What did they cost, you ask?

Four Hard Disks For Under $1000

Inexpensive help for your disk storage space woes

by Richard Grehan

IF YOU ARE a peruser of the back pages of BYTE like most of us, you cannot have failed to notice the plummeting prices of hard-disk systems, particularly those available for the IBM Personal Computer. It is commonplace to find a complete subsystem, including hard disk, controller card, and software, for under $1000.

The advantages of a hard disk should be obvious: Its speed, convenience, and storage space eliminate most of the agonies involved with managing a large pile of floppy disks. If you're interested in setting up a personal bulletin-board system, the purchase of a hard-disk system should be your top priority.

I selected four hard-disk systems from the pages of BYTE and other computer periodicals. My only criterion was that the complete system must cost less than $1000. This article by no means exhausts all the under-$1000 hard disks advertised, but it should give you an idea of some possible trade-offs and troubles if you decide that a hard disk is your PC's next peripheral. Performance and price information is provided in table 1.

The Sider

The Sider is from First Class Peripherals, a Carson City, Nevada, company. An external drive, it is consequently the easiest of the four to install. This also means that the drive has its own power supply; the only added power burden to your PC is the interface card. Additionally, since the Sider does not replace one of your system's floppy-disk drives (all of the internal drives reviewed install in place of one floppy-disk drive), you lose no functionality when you need to, say, copy one floppy disk to another. Best of all, you are spared the trouble of digging through the technical manuals to discover which switches on the PC's motherboard you have to flip to configure the IBM as a one-floppy system.

The Sider comes in a rather large (7 1/2 inches tall, 16 1/2 inches long, and 3 1/2 inches wide) cream-white molded-plastic housing. The hard disk is mounted on its side, and the mechanism is convection-cooled via the case's slotted top. (This slotted top warrants caution: Small objects and certainly fluids could be unwittingly dropped into the inner workings of the unit, inflicting heaven knows what damage.) Since the unit is taller than it is wide, I experienced a not unjustified fear of knocking it over. A rather stiff but comfortably long cable connects the drive to the interface card. The installation and operation guide that comes with the Sider is a small 31-page booklet. It is clear and easy to read, obviously written for people with an absolute minimum of hardware knowledge. It includes numerous illustrations of what goes where an…

Yes. Under a thousand dollars (over twenty-five hundred of today's dollars, adjusting for inflation) is "inexpensive". And what capacities do you get? 800 dollars buys you an external disk (super portable: roughly 19 by 42 by 9 centimeters; I don't dare look up the weight) holding ten megabytes, which "only" needs to be switched on 30 seconds before the computer (I swear, click on the image, turn the page and read). One of the internal ones, the SyQuest (a company that would last until its 1998 bankruptcy), reaches the outrageous figure of 30 megabytes #madreDelAmorHermoso. And if you need to economize, there's the Rodime, which gives you 10 megs for barely 500 dollars. They're flying off the shelves. Blessed be Moore's law (and family).
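The comparisons scattered through the last few paragraphs are easy to reproduce; a back-of-the-envelope sketch (the 1985 figures come from the ads quoted above, while the modern bulk-storage price of roughly $0.02 per gigabyte is my own assumption):

```python
# Back-of-the-envelope 1985-vs-today comparison. The 1985 figures
# come from the magazine's ads; the modern $0.02/GB figure is an
# assumption, not from the article.

def kb_per_second(megabytes: float, minutes: float) -> float:
    """Average throughput of a transfer, in KB/s."""
    return megabytes * 1024 / (minutes * 60)

def dollars_per_mb(price: float, megabytes: float) -> float:
    """Storage cost in dollars per megabyte."""
    return price / megabytes

# "Ten megabytes in 8 minutes", as advertised:
print(f"{kb_per_second(10, 8):.1f} KB/s")     # ~21.3 KB/s

# The Sider: $800 for 10 MB...
print(f"${dollars_per_mb(800, 10):.0f}/MB")   # $80/MB

# ...versus an assumed $0.02/GB today: about four million times
# cheaper per megabyte, before even accounting for inflation.
print(f"{dollars_per_mb(800, 10) / (0.02 / 1024):,.0f}x")
```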

Another thing that isn't exactly recent? Give me a problem, any problem at all, and someone will solve it for you with a spreadsheet:

Circuit Design with Lotus 1-2-3

Use the famous spreadsheet to design circuits and print out schematics

by John L. Haynes

SPREADSHEETS, especially those with graphics, are not just for business applications; they can be of great help to circuit designers or anyone else designing systems that can be described by equations.

As an example, let's take a look at the application of one spreadsheet, Lotus 1-2-3, to one technical problem, electronic circuit design and analysis. We'll look at both digital and linear circuits.

Digital Circuits

Digital circuits are built from logic building blocks—inverters, NAND gates, flip-flops, etc. We can simulate each of these components with the equations in a cell of a spreadsheet, using the spreadsheet's built-in logical operators shown in figure 1. For instance, in the spreadsheet portion of Lotus 1-2-3, the equivalent of an inverter is the operator #NOT#, structured as #NOT#(A=1). This structure means the state of the operator #NOT# is not true, or equal to a logical 0, if the state in the parentheses is true. This is equivalent to the output of an inverter circuit whose input is A. Similarly, the model of a NAND gate, #NOT#(A=1#AND#B=1), is not true if input A and input B are both true. The flip-flop is a bit more complex, since its output depends not only on its input conditions but on the transition of a clock pulse. For simplicity, let's assume that there is a narrow clock pulse that triggers the flip-flop whenever the clock pulse is true—in other words, whenever its logic state is a logical 1. The Q output remains in its present state until the clock is true; it then assumes the state of the input D. The Q' output is the logical opposite of Q.

These actions are easily simulated using the logical @IF function. It is structured as @IF(A,B,C) and means IF A THEN B ELSE C. That is, if the logical condition of A is true, then the function equals B. Otherwise, the function equals C. Setting the variables as @IF(C=1,D,Q), we can interpret the state of the function as: If the clock C is true, the state is equal to D; otherwise, it remains Q. The Q' output is handled with the #NOT# operator.

Given the ability to simulate logic components with spreadsheet functions and operators, let's now look at how we can use this technique to build a simple digital circuit. The synchronizing circuit of figure 2 is a commonly encountered arrangement. Known variously as an edge detector, a synchronizing circuit, and a digital differentiator, it develops a pulse one clock period long when an external,

Electronic circuit design with Lotus 1-2-3. Seriously. It's not an April Fools' joke. Or it is, but a supreme one.
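The mapping from gates to spreadsheet formulas translates just as directly into any language with Boolean operators. A minimal Python sketch of the same idea (the edge-detector wiring below is the standard textbook version, assumed rather than copied from the article's figure 2):

```python
# The Lotus formulas map one-to-one onto Boolean expressions.
# Gate definitions follow the article; the edge-detector wiring
# is a standard version, assumed rather than taken from figure 2.

def inverter(a: bool) -> bool:          # #NOT#(A=1)
    return not a

def nand(a: bool, b: bool) -> bool:     # #NOT#(A=1#AND#B=1)
    return not (a and b)

def dff(clock: bool, d: bool, q: bool) -> bool:  # @IF(C=1,D,Q)
    return d if clock else q

def step(inp: bool, q: bool):
    """One clock period of the edge detector: the flip-flop samples
    the input, and the output pulses only on a rising edge."""
    new_q = dff(True, inp, q)                    # register the input
    pulse = inverter(nand(inp, inverter(q)))     # inp AND (NOT q)
    return new_q, pulse

q = False
for inp in [False, True, True, True, False]:
    q, pulse = step(inp, q)
    print(int(pulse), end=" ")          # prints: 0 1 0 0 0
```

The single `1` in the output is the one-clock-period pulse the article describes: the input went high, but the registered copy of it had not caught up yet.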

And I return to my pet topic, "things that would never, ever be published in a general-interest magazine today":

One Million Primes Through the Sieve

Generate a million primes on your IBM PC without running out of memory

by T. A. Peng

A POPULAR WAY to benchmark microcomputers is with the Sieve of Eratosthenes. It is a simple and effective method for generating prime numbers. However, if you try to use the Sieve to obtain more than a few thousand primes on your IBM PC, you will soon encounter the dreaded phrase, "Out of memory." You would think, then, that as far as microcomputers are concerned, the Sieve of Eratosthenes would be an impractical way to generate a large number of primes. This is not so. Let me show you how to use the Sieve to generate a million primes on your microcomputer.

Listing 1 (written in Microsoft BASIC) illustrates how, with very little memory, you can put 500,000 numbers through the Sieve to obtain all the primes less than 1,000,000. The idea is quite simple. Use an array of flags to represent the first 1000 odd numbers. After the nonprimes among them have been sieved out, reinitialize the array to represent the next 1000 odd numbers. Lines 120 through 140 initialize the array and lines 340 through 360 reinitialize it before you use it for the next 1000 numbers.

The largest prime whose square is less than 1,000,000 is 997, and it is the 168th prime, starting with the prime 2. To generate all the primes less than 1,000,000, you don't have to use primes larger than 997. This is the reason for line 220 and for the size of two of the arrays in line 110. The loop in lines 240 through 270 flags all numbers less than 1000 that do not yield primes. (We have K = I + nP, so that K + K + 1 = (I + I + 1) + 2nP = P(2n + 1), which is not a prime.) After each loop is executed, the value of K will be greater than 1000 (and K would flag the next number if the size of the array were larger), and this is remembered as K(C). The variable C keeps count of the primes generated, with C - 1 as the actual number of primes generated at the end of each loop. Line 390 assures that the value of K lies between 1 and 1000. You need line 460 to give the correct value for the prime Q in line 490.
All the variables except C, Q, and R are integer-valued. There is a reason for this. If the program executes correctly, the output of line 540 should read, "999,983 is the 78,498th prime and the largest less than 1,000,000."

It is clear how to modify listing 1 to generate all the primes less than 2,000,000 or even 10,000,000, but to get a predetermined number of primes, we need to know a little about their distribution. Specifically, what we need to know is the size of the arrays K and P and the largest prime to be used in the Sieve. And in order to know this, we must have a rough idea of how large the...

The Sieve of Eratosthenes, friends. Which, by the way, is not an especially complicated algorithm to understand (we leave it as an exercise for the reader to turn the page and try to make sense of the BASIC code on the following page :-)). Now I feel like checking how much RAM is used by the little Python program that ChatGPT can generate in less time than you'd need to type the first three lines of the listing in the magazine… but not enough to actually do it O:-).
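Peng's trick, sieving a window of 1000 odd numbers with a small reusable flag array, is what we would now call a segmented sieve. A minimal Python sketch of the same idea (not a transcription of the BASIC listing):

```python
import math

def segmented_sieve(limit: int) -> int:
    """Count primes below `limit`, sieving a fixed-size window of odd
    numbers at a time so memory stays small however large `limit` is."""
    # Small primes up to sqrt(limit), found with a plain sieve.
    root = math.isqrt(limit)
    base = [True] * (root + 1)
    base[:2] = [False, False]
    for i in range(2, math.isqrt(root) + 1):
        if base[i]:
            base[i * i::i] = [False] * len(base[i * i::i])
    small = [i for i, is_p in enumerate(base) if is_p]

    count = 1 if limit > 2 else 0   # count the even prime 2 separately
    window = 1000                   # odd numbers per segment, as in Peng
    lo = 3
    while lo < limit:
        hi = min(lo + 2 * window, limit)
        flags = [True] * ((hi - lo + 1) // 2)   # odds in [lo, hi)
        for p in small[1:]:                      # skip 2: odds only
            # First odd multiple of p inside the segment, >= p*p.
            start = max(p * p, (lo + p - 1) // p * p)
            if start % 2 == 0:
                start += p
            for m in range(start, hi, 2 * p):
                flags[(m - lo) // 2] = False
        count += sum(flags)
        lo = hi                                  # reuse the flag array
    return count

print(segmented_sieve(1_000_000))   # 78498, as the article promises
```

Only the 168 base primes and one 1000-entry flag list live in memory at any moment, which is exactly why the technique sidesteps the PC's "Out of memory".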

And to close… multitasking:

Top View

IBM's long-awaited multitasking program makes its debut

by TJ Byers

TOPVIEW is a multitasking program that, for $149, enables your IBM Personal Computer to install more than one program in the system. This is different from the window programs that presently claim to accomplish the same thing. When working with windows, you must quit a program before you can begin another. With TopView, however, you don't have to quit either one of them. Both can be resident on the screen—and, more important, in the microprocessor—at the same time.

Multitasking

TopView's multitasking capabilities allow several programs to run simultaneously (see photo 1). This isn't the same thing as switching between programs without quitting them; it means that you can actually have one program running in the background while using another. Let's say, for example, that you need to calculate a large spreadsheet, and the job will take several minutes. Instead of staring idly at the screen while the computer crunches away, you can banish the spreadsheet to TopView's background mode and proceed to work on another program— the computer will handle both tasks at the same time. While one program is making calculations in the background, the other can be receiving data from the keyboard. You lose no time waiting for one program to finish before you start the other.

Multitasking is not a new concept. Mainframe computers have used multitasking for many years to enhance their performance. What is new, however, is putting multitasking capabilities into a personal computer.

TopView brings multitasking to the IBM PC using a multiplexing technique known as time slicing. Basically, TopView divides the microprocessor's time into slots during which it switches rapidly from one program to another. The time slices are very short, on the order of milliseconds, and the switching action is not apparent to either the application program or the user, so the programs appear to be running concurrently on the machine. In actuality, they are processed consecutively in very quick order. The procedure gives a single computer the ability to run more than one program at a time.

Multitasking is not without its faults, however. While one program is being processed, the others are held in suspension. Consequently, the programs tend to run more slowly. The more programs you have running at the same time, the slower each apparently becomes. A quick benchmark test using TopView to conduct a simple word search of Writing Assistant on an IBM PC AT showed that it took a full 14 seconds to search a typical 3000-word file as...

The thing is, in 1985, having a personal computer run several programs in parallel was not exactly trivial. So much so that charging 150 dollars for the program that did it wasn't outlandish. Even if it cut your software's performance by 75% (something you'd only notice when running compute-intensive programs, sure, but you were the one who had to think about that) or ate a good chunk of the computer's RAM.
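TopView's time slicing can be imitated, very loosely, with generators: give each "program" one short slice of work at a time, round-robin, so both make progress. A toy sketch (cooperative rather than preemptive, which is a simplification of what TopView actually did):

```python
# Toy round-robin "time slicing": each task is a generator that
# yields after a small unit of work, and the scheduler rotates
# through the live tasks. Cooperative, not preemptive: a sketch
# of the idea, not of TopView's implementation.
from collections import deque

def spreadsheet_recalc(cells: int):
    total = 0
    for i in range(cells):
        total += i * i              # pretend this is a recalculation
        yield                       # end of this task's time slice
    print(f"recalc done: {total}")

def keyboard_echo(keys: str):
    for k in keys:
        print(f"typed: {k}")
        yield

def scheduler(tasks):
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)              # give the task one slice
            ready.append(task)      # still alive: back of the queue
        except StopIteration:
            pass                    # task finished, drop it

# The spreadsheet keeps crunching while keystrokes are echoed.
scheduler([spreadsheet_recalc(5), keyboard_echo("hi")])
```

The interleaved output shows the point of the article: neither task waits for the other to finish, even though only one runs at any instant.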

By the way: the "windowed" interfaces of the era were priceless (although, as it happens, "TUI" programs are coming back into fashion today, in a wonderful return to the past :-)).

A couple of photos of the attempts to show several applications on screen using a purely textual interface. I don't feel up to giving a faithful description.

Well, we'll stop here, since we're running late. More next month in a few days (weeks, more likely).

As usual, you'll find the Byte archives on archive.org, and if you like, you can get a head start on the December issue.