Byte, December '85

Time to close out the year with our usual rereading of Byte magazine… from forty years ago. This time, the hot topics… of 2020.

Cover of the December 1985 issue of Byte. The theme is "computer conferencing", with an illustration of a D-type connector and a flat ribbon cable... but one in which the connector's pins are people sitting around a table.

(You'll grant me that repurposing a D-type connector like that is, at the very least, creative :-).)

And we start with an ad, not so much for the product itself… but because by 1985 it had already been a few years (not many, granted) since Bill Gates had said (or not) that 640K ought to be enough… and we already knew it wasn't.

Ad for a two-megabyte memory expansion, with the slogan "for those occasions when 640K doesn't seem to be enough".

By the way: forty years later, your computer has… four thousand times more memory? Eight thousand?

In the «things that are not new» department…

And Now, Adware

In response to the letter by Mr. Tate ("Don't Sell Software. Sell Ad Space," August, page 26) regarding the selling of advertising space in entertainment software: Wow! What a great idea. Adware (that's my term for it) could resurrect the failing home computer industry.

Let's face it, most home computers are used for entertainment; however, the general public is not usually willing to spend $30 to $100 for a game. In general I feel that this attitude applies to all types of home entertainment. Look at how successful television has become simply because you don't have to pay for it to enjoy it (unless you want cable or pay TV, but even that is relatively inexpensive). With Adware you would still have to incur the cost of downloading from the telephone. This same reason also accounts for the to-date unsuccessful home videotex systems.

Mr. Tate mentions the advantages of Adware but fails to mention the virtues of the Freeware concept and what Adware could bring to it. I personally do not agree with the idea of selling copy-protected entertainment software commercially. Computers are very good at copying software, and so this fact should be put to good use. Freeware (the free distribution of software by encouraging copying) offers the users a better and more dynamic product. For example, I have a Freeware product that I continue to update as improvements and additions are implemented. When a new release is ready I simply make it available on the Freeware market. You cannot do this economically with a similar commercial product without covering your expenses by raising the retail price. With Adware you could make it a policy to release a new version every few months to insure a dynamic advertising medium.

At present the Freeware distribution network is not firmly established, but if the amount of Freeware and the demand for it grew large enough I am sure that regular channels would establish themselves quickly so that everyone could have almost immediate access to the updates. Another benefit of this concept would be...

…because the debate over software funding models turns out to be about as old as software itself.

We never skip the technologies that improve computer accessibility:

The Audiodata/IBM PC Keyboard from Frank Audiodata GmbH of West Germany uses tone and speech capabilities to make the IBM PC accessible to blind and visually impaired users. The system generates different tones depending on the type of data at the cursor's screen location. To position the cursor, you use sliding switches that correspond to the horizontal and vertical axes.

The vertical switch is on the left-hand side of the Audiodata keyboard, next to the function keys. Moving it from top to bottom yields a series of tones that tells you whether lines are blank or full of text. The horizontal switch is below the space bar. Moving it left and right yields tones that indicate letters, spaces, numbers, and punctuation marks in a line. By moving the switches and listening to the resulting tones, you can tell how many characters of what type are at what position on the screen.

The keyboard contains a Votrax SC-01 speech processor, so you can literally have the system read a portion of text out loud. Pressing a button on the vertical switch tells the system to read the line of text that corresponds to its position. Using the vertical and the horizontal switches together, you can have the system read or spell particular words.

The Audiodata keyboard works with standard or large-print monitors or with no monitor at all. It comes with a 6-inch add-in card and the system software for $3450.
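The tone-per-character-class trick is simple enough to sketch. A minimal toy version in Python; the frequencies and the exact classes are my inventions, since the note doesn't spell out Audiodata's actual mapping:

```python
# Toy reconstruction of the Audiodata idea: sweep along a line of screen text
# and emit a different tone per character class. Frequencies are made up.

def tone_for(ch: str) -> int:
    """Return a tone frequency (in Hz) for one screen character."""
    if ch == " ":
        return 220          # low tone: blank
    if ch.isalpha():
        return 440          # letters
    if ch.isdigit():
        return 660          # numbers
    return 880              # punctuation and everything else

def sweep(line: str) -> None:
    """Simulate moving the horizontal switch across one line."""
    for col, ch in enumerate(line):
        print(f"col {col:2d}: {ch!r} -> {tone_for(ch)} Hz")

sweep("Total: 42 files.")
```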

Let's be scandalized, that said, by the 3,450 dollars (of the era) that the gadget cost. Much cheaper was this proto-touchpad (which, more than a touchpad, was a keyboard for macros):

Touchpad Accessory for the IBM PC

Polytel Computer Products has introduced the Keyport 60, a small rectangular touchpad that fits along the top of the IBM PC keyboard. It has 60 touch-sensitive regions that can be programmed as function keys and defined in regular and shift modes, so the Keyport 60 will accommodate a maximum of 120 macro commands.

To record a macro, you press the Alt key on your regular keyboard and a touchpad key simultaneously. Any keystrokes that follow are recorded until you press the Alt and touchpad keys a second time.

The touchpad package comes with KPEDIT, a full-screen editor that allows you to edit key definitions.

Keyport 60 works with the IBM PC, XT, AT, and compatible personal computers, using the joystick adapter to allow concurrent operation with your regular keyboard. It costs $399.

Here's something I'm still surprised the magazine hadn't added earlier. Yours truly spent a non-negligible chunk of the eighties typing in code that came in various magazines. With my nonexistent income at the time, that was time well spent. But the buyer of a magazine like Byte most likely had the financial means not to see it that way:

Page announcing a new service from the magazine: the sale of floppy disks with the code printed in the magazine. You can choose between 5¼-inch disks for the IBM PC, the Apple II in two formats, the Commodore 64, the Hewlett-Packard 125, the Kaypro 2 under CP/M, two TRS-80 models, the Texas Instruments Professional, the Zenith Z-100 and the Atari. In 3½ inches, the formats are Apple Macintosh, Atari ST, Commodore Amiga, Hewlett-Packard 150 and Data General/One.

Prices range from five dollars for a 5¼-inch disk in the United States to eight for 3½- or 8-inch disks shipped to Asia, Africa and Australia.

Bytecom is also announced, «computer conferencing among Byte's readers in Europe»: a handful of BBSs outside the United States.

And, as a bonus, we get the ranking of the most popular computers among the magazine's readers at the time. Note also that the 8″ disks in standard CP/M format are in there, and that the prices, considering the operation must have been fairly manual, strike me as quite reasonable.

I'm not including a screenshot because the scan of the corresponding pages in the Archive is bad (you can browse to the corresponding page of Byte – a visual archive), but the issue's books section is good. It opens with the review of Back to BASIC: The History, Corruption, and Future of the Language (which can be found on Amazon, curiously), in which the designers of BASIC apparently complain bitterly about the thing their language has turned into, twenty years after its creation, and explain why they were launching True BASIC in 1985 (that the domain exists and that you can buy current and vintage versions of the language, and manuals, on it, blew my mind). The authors explain that BASIC was not designed to be an interpreted language but a compiled one, and that this, added to the RAM constraints of the computers it was being used on in the 80s, had wrecked its design. The things you learn.

Further on, Ethical Issues in the Use of Computers gets reviewed. Again, the Archive's digitization is bad, but you can get to the visual archive, to remind us that this is not exactly a 21st-century topic, and that forty years ago we already worried about the risks of big databases for our privacy, the digital divide, or algorithmic decision-making about our health.

Going back to the floppy-disks-with-the-magazine's-code service, and to our usual «things that wouldn't make it into a magazine today, not a chance» department…

A SIMPL COMPILER PART 1: THE BASICS

by Jonathan Amsterdam

An implementation of a compiler for a simple structured language

In this article— the first of a three-part series on the construction of a compiler for a high-level language— I will discuss the basics of the compiler. Next month I will talk about procedures and functions, and in the third part of the series I will describe some of the compiler's extensions.

Three of my earlier Programming Projects are prerequisites for this one. "Context-Free Parsing of Arithmetic Expressions" (August, page 138) explains the parsing technique I will be using. "Building a Computer in Software" (October, page 112) describes VM2, the virtual machine for which my compiler is targeted. And "A VM2 Assembler" (November, page 112) details the assembly-language code that the compiler will generate.

The SIMPL Programming Language

I will be describing a compiler for a language of my own design, called SIMPL. SIMPL, which stands for "SIMPL Isn't Much of a Programming Language," isn't much of a programming language. SIMPL's grammar is given in figure 1. There are a few points that are not described by the grammar. An identifier is any string of letters and numbers beginning with a letter. Unlike most implementations of Pascal, SIMPL is case-sensitive, so the identifiers READ and Read mean different things. SIMPL keywords, like PROGRAM and BEGIN, are capitalized. Comments in SIMPL are delimited by braces ({ }). As in Pascal, character constants are delimited by single quotes, but SIMPL also allows the backslash character (\) to act as an escape. When followed by an n or a t, the backslash denotes a new line (carriage return) or tab; when followed by any other character, it denotes that character. For example, the character constant for the single quote looks like '\''.

SIMPL's WHILE and IF statements, like those of Modula-2, are explicitly terminated by an END. The AND operator has the same precedence as OR, and both have weaker precedences than those of all other operators, so it is unnecessary to put parentheses around expressions connected by AND and OR. Furthermore, expressions surrounding an AND or OR will be evaluated from left to right, and no more than necessary will be evaluated. For example, in the expression TRUE AND FALSE AND TRUE, the first TRUE will be evaluated and then the FALSE will be...
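That equal-precedence, strictly left-to-right, short-circuiting rule for AND and OR is the interesting design decision here (Pascal and C both rank AND above OR). A minimal Python sketch of what the described semantics imply, under my own reading of the text rather than Amsterdam's actual code:

```python
# SIMPL-style boolean evaluation: AND and OR share one precedence level,
# associate left to right, and short-circuit. Tokens come pre-split.

def evaluate(tokens):
    """tokens: e.g. ["TRUE", "OR", "FALSE", "AND", "FALSE"]."""
    it = iter(tokens)
    result = next(it) == "TRUE"
    for op in it:
        operand = next(it)              # the value token after AND/OR
        if op == "AND":
            if result:                  # short-circuit: skip when already False
                result = operand == "TRUE"
        else:                           # "OR"
            if not result:              # short-circuit: skip when already True
                result = operand == "TRUE"
    return result

# Left to right with equal precedence: (TRUE OR FALSE) AND FALSE -> False.
# C-style precedence would read it as TRUE OR (FALSE AND FALSE) -> True.
print(evaluate(["TRUE", "OR", "FALSE", "AND", "FALSE"]))    # False
```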

And we can't close the section without including voice interfaces. If anyone out there is capable of traveling to the past, please have the delicacy not to mention to the author that the thing still had a few decades to go.

English Recognition

The ultimate in user-friendliness

Plain English is hardly ever used to communicate with a computer. This is unfortunate because it can be very effective, and programs that recognize and use relatively complex English sentences have been written for microcomputers. English gives you a variety of ways to express complex actions with a minimum of training and program interaction. Menus, on the other hand, are often highly complex and cumbersome— both for the user and the programmer. Special languages are difficult to learn and to design and implement correctly.

Some applications seem to demand a natural-language controlling mechanism (for example, database programs and games). When you design these kinds of programs, it is hard to predict the questions or commands a user might enter. Even in the largest and most expensive custom database systems, there always seem to be questions outside the scope of the programming.

However, English has been used successfully to control database programs. The first public success was LUNAR (Lunar Sciences Natural-Language Information System), which allowed scientists to query a large file of physical and chemical data on the lunar rock samples brought back by the Apollo 17 mission in December 1972. More recently, Larry Harris of Artificial Intelligence Corporation has been successfully selling a database retrieval system called ROBOT (now INTELLECT) that uses natural English. It runs on IBM machines and licenses for tens of thousands of dollars. R:base CLOUT by MicroRIM, another English-based database-retrieval system, runs on microcomputers, but it's not cheap either. Several game designers have recognized the benefits of using English to communicate with computers and have tried to use it as their controlling mechanism. However, the approach they take seems a bit limited. The games often have trouble recognizing what should be valid directions or questions by the players.

Now, on to the cover story:

AN OVERVIEW OF CONFERENCING SYSTEMS

by Brock N. Meeks

A guided tour through COM, EIES, PARTI, NOTEPAD, and other systems

NUOSO LIVES on the African continent. Exactly where he lives and the name of his tribe is not important; Nuoso is a nonperson. Convicted of a crime against his tribal society, he is forbidden to communicate with his family, his friends, in short, with anyone. His communication cut off, Nuoso quickly withdraws from the village. Eventually he will cease to exist even in his own mind, and he will literally die from lack of communication.

Just as people need food, water, and shelter, so they need to communicate. From the earliest days of history, our ancestors sought better ways to communicate. Primitive maps scrawled in the dust gave way to cave paintings, where information retrieval entailed nothing more complicated than remembering the right wall in the right cave. But just as society became more complex, so did the communication needs of the population.

Early telegraph links, in Napoleon's time, had signal speeds of about two characters per second. In 1913 vacuum-tube repeaters were introduced to telephony, and a rapid succession of advancements in the world of electronics followed. In 1918 the first carrier system permitted several voice channels to occupy a single pair of wires. The early 1940s saw high-capacity coaxial cables beginning to replace twisted-pair cables. Microwave links emerged in 1946 with the capacity to carry more than 10,000 telephone channels. Today's phone system uses satellite links and will soon use fiber optics. In a hundred years our communication capability has risen from fifteen to a billion bits per second, from two to over a hundred million characters per second. And all for the sake of improving communication with each other.

The Birth of Computer Conferencing

Early in 1970, political and economic pressures set the stage for the creation of a revolutionary means of communication. In the fall of 1971 the entire economic structure of the United States fell under the control of President Nixon's wage-price freeze. Because of the tremendous need to handle the reporting and information dissemination of the price freeze, the Office of Emergency Preparedness (OEP) commissioned Murray Turoff to create a computerized version of the "conference call." Turoff responded by developing the Emergency Management Information System and Reference Index (EMISARI).

The EMISARI system operated as an electronic network linking the ten OEP regional offices. The new price controls created a nationwide demand for information, guidelines, rulings, official statements, and policy clarifications from businesses, labor unions, and administrators. Because EMISARI eliminated the constraints of time and geographic location, the OEP's regional administrators were able to secure time-critical information at their convenience. The instant access of EMISARI allowed Washington to update policy as it happened and gave all involved the opportunity to respond or ask questions— with both...

Yes, dear reader, it was obvious that we couldn't be talking about the Zooms of the 80s, but about the asynchronous Discords (at a stretch; what gets described is really more like Reddits) that the era's computers and telecommunication networks could support.

Worth highlighting: (i) the word «Internet» does not appear in the piece and, (ii) you can go to page 174 for a sidebar on the effects these systems were going to have on people and societies. If I'm not mistaken, by the way, the author was, at the time, busy founding the mythical The Well.

Back to the ads. How many years would you have given Logitech and its mice?

Ad for the Logitech LogiMouse C7 mouse.

OK, quite a few of you would have given them at least forty. But… how many of you knew that Logitech used to be in the compiler business?

Ad for Logitech's Modula-2/86.

(According to Wikipedia, Logitech was founded in Switzerland in '81 with the goal of creating a word processor, and from there they moved on to «rodents». By 1984 they already had wireless ones (infrared). The C7 you see above was their first mouse sold retail. You too would have paid over two hundred euros (adjusted for inflation) to have one, right? The Modula-2 thing, it seems, was just a passing fancy.)

I won't leave without a quick once-over of the era's computers:

Two laptops of the era and their specifications.

Both weigh 4.5 pounds (about two kilos).

The Tandy 200 has an 8-bit 80C85 processor at 2.4 megahertz, while the NEC PC-8401A has a Z80-compatible processor at four megahertz. The Tandy has 24K of RAM, expandable to 72; the NEC comes with 64, expandable to 96.

Both have LCD screens: 40 characters by 16 lines on the Tandy, 80 by 16 on the NEC.

Both have 300-baud modems. The piece highlights that both have batteries.

The Tandy runs a proprietary operating system; the NEC runs CP/M 2.2. Both come with software suites that include a word processor and spreadsheets.

Did you notice that they mention both can run off their own battery? (They comment that the NEC, on four C cells (who said «rechargeable battery»?), lasted a couple of hours.) And that no internal storage of any kind gets mentioned? I won't bother calculating what the thousand dollars they cost would be today, but they do point out that a functional NEC setup goes well past two thousand… I'll also note that the piece closes with a «what is a laptop for» that, given the era's technology, was a more than reasonable question. Oh, the wonderful eighties.

And we close with a software classic:

The Norton Utilities

Tools for recovering data and directories

Peter Norton's data-recovery tool really recovers lost data. I've used it successfully dozens of times. Will it save every lost file? No. Unfortunately, there are some kinds of damage that the Norton Utilities can't repair. Can you tell, before buying the program, whether it will help you recover a particular file? The answer to this question has to be inconclusive. There are different kinds of lost data, and sometimes, even when you know how the damage occurred, it is difficult to predict whether it can be repaired.

The simplest kind of loss occurs when you delete a file by using the ERASE or DEL commands in DOS. Even though your directory indicates that the file no longer exists, it hasn't really been erased. What's happened is that an instruction prohibiting DOS from writing in certain areas of the disk has been altered. Your data is retained until information is actually written into these sectors. If you change your mind and decide that you need the discarded data after all, the Norton Utilities will reverse the changes made by the ERASE command and your old file will be restored.

Certain types of equipment failures can produce more serious data losses. Every DOS-formatted disk contains hidden files called the boot record and the file allocation table; these, together with the directory, are used for managing the data stored on the rest of the disk. If garbled information is entered into these files— a common cause is a disk drive out of alignment— your data may become inaccessible. But sometimes the data files themselves may survive this damage; if so, you may be able to recover some or all of them.

Retrieving Lost Data

If you plan to use the Norton file-recovery procedure, you should be careful not to write on a disk with a lost file. You'll risk having new data entered in the sectors containing the file you hope to save. Once the old information has been overwritten in this way, it can't be recovered.

If you did write on the disk, there's still a chance that the sectors holding the erased file were not the ones that received the new data; this depends on factors like how much free space was on your disk and which version of DOS you're using. So until you actually begin the recovery procedure, you won't know for sure whether the lost file was destroyed. Still, it's best not to take chances. Make it a rule to never write on a disk containing damaged files.

Another good idea is to make a copy of your damaged file whenever possible. If you're working with floppy disks, you should use the DOS DISKCOPY command (as opposed to the COPY * . * command); DISKCOPY works by reproducing what's on the source disk exactly, byte by byte, so even deleted data is copied. Carrying out the recovery procedure on a copied version of the damaged file means that if you make a mistake, you'll have a chance to recopy the original and try again.

After taking these precautions, you can begin the file-recovery procedures. In general, for simple problems like unintentionally invoking an ERASE command, you can expect the Norton Utilities to retrieve lost files consistently. When a problem is caused by a current spike, static electricity, or a disk drive out of alignment, it's harder to predict how much of a file can be recovered; this is because so many different varieties of error can occur.

With many types of errors, the Norton Utilities can often help you salvage something. At times you may be able to recover only portions of a file. This is similar to what happens when the CHKDSK procedure in DOS restores only some of the lost clusters (groups of isolated sectors) in a file. In many cases you can save enough of a file to be able to reconstruct the missing portions with little trouble. But for some files, such as those created with spreadsheet programs, even a small amount of data loss can

The Norton Utilities from the celebrated (and currently octogenarian) Peter Norton had been on the market since '82 (and, perhaps more surprisingly, new versions would keep being released until 2012).
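For context, the unerase trick is less magical than it sounds: on FAT-formatted DOS disks, deleting a file just overwrites the first byte of its 32-byte directory entry with the marker 0xE5 and frees its clusters in the FAT; the data sectors themselves stay put. A minimal sketch of the scanning half of the idea (my own illustration, not Norton's code):

```python
# Find "deleted" entries in a raw FAT directory sector. On FAT12/16, deleting
# a file overwrites byte 0 of its 32-byte directory entry with 0xE5; recovery
# tools list those entries and ask you for the lost first letter of the name.

ENTRY_SIZE = 32          # bytes per FAT directory entry
DELETED_MARK = 0xE5      # first byte of a deleted entry

def deleted_entries(directory_sector: bytes):
    """Yield (offset, partial 8.3 name) for each deleted entry found."""
    for off in range(0, len(directory_sector) - ENTRY_SIZE + 1, ENTRY_SIZE):
        entry = directory_sector[off:off + ENTRY_SIZE]
        if entry[0] != DELETED_MARK:
            continue
        name = (b"?" + entry[1:8]).decode(errors="replace").strip()
        ext = entry[8:11].decode(errors="replace").strip()
        yield off, f"{name}.{ext}"
```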

As usual, you have the Byte magazine archives at archive.org, and if you like, you can get a head start with the January issue… of '86! See you next year :-).

Byte, November '85

I'm running late! It's December and the Byte cover says it's still November (yes, of 1985, of course). Anyway, here we go, in a hurry, with the magazine rundown.

The cover of Byte, November 1985. The price is three dollars and ninety-five cents. The cover story is Inside the IBM PCs. The black-and-white illustration is a human figure (it looks like a man in a suit, briefcase in hand) standing before an enormous PC-style computer of the era that seems to come apart into a kind of three-dimensional puzzle.

And we start with what is still, in 2025, the topic of computing magazines, blog posts and YouTube videos galore: public-domain utilities:

Public-Domain Utilities

Build an extensive software library for free

by Jon R. Edwards

THE EXTENSIVE public domain collection for the IBM Personal Computer and compatibles is a very valuable resource. It is easily possible to build an extensive software library and incorporate the utilities into your home projects or to save considerable time and effort by installing a RAM (random-access read/write memory)-disk and print spooler. Most programs in the public domain provide source code; you can learn from the code and, more important, you can customize the routines for your own requirements. Undoubtedly, some of the software will fill your needs, and the more obscure programs may simply trigger your imagination.

The notion that "free means shoddy" does not necessarily apply to this software. I suspect that most of the free utilities were originally written to fill individual needs and as part of the "hacker ethic" have been shared with the public. The programs adequately fill many needs, and they have a tendency, as the user community modifies and expands them, to become more and more bug-free and sophisticated. Most public-domain programs provide limited functionality, and their user interfaces and documentation are generally less polished than commercial products, but it is amazing how many commercial products do very little more than integrate the capabilities of programs that already exist in the public domain. If nothing else, exposure to these programs will make you more aware of what to look for and expect from the products you buy. And who knows —in the short descriptions that follow, you may find software that's perfectly suited to your needs. At least the price is right.

Free Software

To the best of my ability, I have concentrated on free, no-strings-attached software and not on shareware or user-supported software. There is, to be sure, a growing amount of shareware for the IBM family, and much of it is excellent (see "Public-Domain Gems" by John Markoff and Ezra Shapiro, March BYTE, page 207), but the products often do not provide source code, and their authors usually request a contribution; most users legitimately feel that the products deserve financial support.

Naturally, I cannot guarantee that the software you download will function as you hope it will. I certainly hope you find dozens of interesting utilities here and that your investigations lead you to new and exciting things, but I take no responsibility if the programs you download do nothing or turn your screen inside out.

Locating free software is getting easier and easier. There are more users groups, bulletin-board systems (BBSs), and public-domain copying services than ever before, and the...

Forty years later we're still just as crazy about getting free utilities, and we still have to explain that «free» doesn't necessarily mean «shoddy». It is curious, that said, to see that in 1985 it had to be spelled out that many of the utilities came with their source code («open source» only became fashionable in the late nineties, says Wikipedia). And one breaks into a cold sweat at the thought of downloading software from BBSs over the era's modems (however little programs weighed back then compared with today's).

If you click through to the page and keep reading you'll find disk utilities, memory utilities, system-status utilities, keyboard helpers, text- and file-manipulation utilities, screen control, small applications, printing utilities, communications software and programming languages (Forth, LISP, Logo). The usual: we have changed in forty years, but not as much as one might imagine.

I think it had been a while since we paid attention to the ads:

Ten megs in 8 minutes is a bit over 20 kilobytes per second (my fiber connection easily does 50 megabytes per second, that is, well over 20 gigs in 8 minutes, and USB 3 ports reach 500 megabytes per second) for barely 180 dollars of the era (460 of today's euros). Go ahead, complain that your USB stick is slow and expensive… And if we stay on the subject, we can review the speeds of the era's disks in general:

Factors Affecting Disk Performance

Four major physical factors determine overall disk performance: access time, cylinder size, transfer rate, and average latency.

Access time is the amount of time it takes to move the read/write heads over the desired tracks (cylinders). Once the heads are over the desired tracks, they must settle down from the moving height to the read/write height. This is called the settling time and is normally included in the access time. Specifications for AT and XT disk-drive options are shown in table A.

A cylinder is composed of all tracks that are under the read/write heads at one time. Thus, tracks per cylinder is the same as the number of data heads in the drive. Cylinder size is defined as tracks/cylinder x sectors/track x bytes/sector.

The Quantum Q540, for example, has four platters and eight data heads, while the Vertex V170 has four platters, seven data heads, and one servo head. The difference is that the Quantum drive uses an embedded (or wedge) servo, where the servo signal is embedded on the data tracks, preceding the data portion of each sector on the disk. The Vertex drive uses a dedicated servo that requires its own surface. This difference means that the Quantum drive has 8.5K bytes more data available to it before it must seek the next track; if all other factors were equal (which they aren't), the Quantum would be slightly faster in those cases that required reading that "extra" 8.5K bytes.

Transfer rate is the rate at which data comes off the disk. It depends on rotation rate, bit density, and sector interleaving. The first two factors are practically the same for all AT-compatible 5¼-inch hard disks, but not for all floppy disks (the AT's spin 20 percent faster than the other PC floppies).

Sector interleaving is used to cut down the effective transfer rate. The interleave factor of 6 used on the XT cuts the effective transfer rate from 5 megabits per second to 0.833 megabit per second. Note that embedded servo disks, such as those used in the XT and the AT, actually spin about 1 percent slower than 3600 revolutions per minute (rpm) to allow for the increased density due to the servo.

Average latency is the time required for a disk to spin one-half of a revolution. For hard disks, which spin at 3600 rpm, the average latency is 8.33 ms (1/3600 rpm x 60 seconds/minute x 0.5 = 8.33 ms per half revolution). This is due to the fact that after the heads finish seeking and settling, you must wait for the required sector to come under the heads.
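All of those figures are easy to sanity-check; a few lines of Python reproducing the arithmetic (the first one is the 10-megabytes-in-8-minutes ad from a few paragraphs back, and the 17-sectors-per-track figure is my assumption, the classic MFM value):

```python
# Quick checks of the numbers quoted above.

print(10 * 1024 / (8 * 60))     # 10 MB in 8 minutes: ~21.3 KB/s, "a bit over 20"

print(5 / 6)                    # XT interleave factor of 6: 5 Mb/s -> ~0.833 Mb/s

rev_ms = 60_000 / 3600          # one revolution at 3600 rpm = 16.67 ms
print(rev_ms / 2)               # average latency: 8.33 ms per half revolution

# One extra track per cylinder at 17 sectors/track x 512 bytes/sector is the
# "extra" 8.5K bytes the article credits to the Quantum drive.
print(17 * 512 / 1024)          # 8.5
```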

The fastest of the era? 300 kilobytes per second. And I don't even feel old remembering it… At what price, you ask?

Four Hard Disks For Under $1000

Inexpensive help for your disk storage space woes

by Richard Grehan

IF YOU ARE a peruser of the back pages of BYTE like most of us, you cannot have failed to notice the plummeting prices of hard-disk systems, particularly those available for the IBM Personal Computer. It is commonplace to find a complete subsystem, including hard disk, controller card, and software, for under $1000.

The advantages of a hard disk should be obvious: Its speed, convenience, and storage space eliminate most of the agonies involved with managing a large pile of floppy disks. If you're interested in setting up a personal bulletin-board system, the purchase of a hard-disk system should be your top priority.

I selected four hard-disk systems from the pages of BYTE and other computer periodicals. My only criterion was that the complete system must cost less than $1000. This article by no means exhausts all the under-$1000 hard disks advertised, but it should give you an idea of some possible trade-offs and troubles if you decide that a hard disk is your PC's next peripheral. Performance and price information is provided in table 1.

The Sider

The Sider is from First Class Peripherals, a Carson City, Nevada, company. An external drive, it is consequently the easiest of the four to install. This also means that the drive has its own power supply; the only added power burden to your PC is the interface card. Additionally, since the Sider does not replace one of your system's floppy-disk drives (all of the internal drives reviewed install in place of one floppy-disk drive), you lose no functionality when you need 
to, say, copy one floppy disk to another. Best of all, you are spared the trouble of digging through the technical manuals to discover which switches on the PC's motherboard you have to flip to configure the IBM as a one-floppy system.

The Sider comes in a rather large (7 1/2 inches tall, 16 1/2 inches long, and 3 1/2 inches wide) cream-white molded-plastic housing. The hard disk is mounted on its side, and the mechanism is convection-cooled via the case's slotted top. (This slotted top warrants caution: Small objects and certainly fluids could be unwittingly dropped into the inner workings of the unit, inflicting heaven knows what damage.) Since the unit is taller than it is wide, I experienced a not-unjustified fear of knocking it over. A rather stiff but comfortably long cable connects the drive to the interface card. The installation and operation guide that comes with the Sider is a small 31-page booklet. It is clear and easy to read, obviously written for people with an absolute minimum of hardware knowledge. It includes numerous illustrations of what goes where an

Yes. Less than a thousand dollars (over two thousand five hundred of today's, with inflation) is «inexpensive». And for what capacities? 800 dollars gets you an external disk (super portable: roughly 19 by 42 by 9 centimeters; I don't dare look up the weight) with ten megs that «only» needs to be switched on 30 seconds before the computer (I swear; click on the image, turn the page and read). One of the internal ones, the SyQuest (a company that would last until its bankruptcy in 1998), reaches the enormity of 30 megabytes #madreDelAmorHermoso. And if you have to economize, there's the Rodime, which gives you 10 megs for barely 500 dollars. They're flying off the shelves. Blessed be Moore's law (and family).

Another thing that isn't exactly recent? Give me a problem, any problem at all, and someone will solve it with a spreadsheet:

Circuit Design with Lotus 1-2-3

Use the famous spreadsheet to design circuits and print out schematics

by John L. Haynes

SPREADSHEETS, especially those with graphics, are not just for business applications; they can be of great help to circuit designers or anyone else designing systems that can be described by equations.

As an example, let's take a look at the application of one spreadsheet, Lotus 1-2-3, to one technical problem, electronic circuit design and analysis. We'll look at both digital and linear circuits.

Digital Circuits

Digital circuits are built from logic building blocks— inverters, NAND gates, flip-flops, etc. We can simulate each of these components with the equations in a cell of a spreadsheet, using the spreadsheet's built-in logical operators shown in figure 1. For instance, in the spreadsheet portion of Lotus 1-2-3, the equivalent of an inverter is the operator #NOT#, structured as #NOT#(A=1). This structure means the state of the operator #NOT# is not true, or equal to a logical 0, if the state in the parentheses is true. This is equivalent to the output of an inverter circuit whose input is A. Similarly, the model of a NAND gate, #NOT#(A=1#AND#B=1), is not true if input A and input B are both true. The flip-flop is a bit more complex, since its output depends not only on its input conditions but on the transition of a clock pulse. For simplicity, let's assume that there is a narrow clock pulse that triggers the flip-flop whenever the clock pulse is true— in other words, whenever its logic state is a logical 1. The Q output remains in its present state until the clock is true; it then assumes the state of the input D. The Q' output is the logical opposite of Q.

These actions are easily simulated using the logical @IF function. It is structured as @IF(A,B,C) and means IF A THEN B ELSE C. That is, if the logical condition of A is true, then the function equals B. Otherwise, the function equals C. Setting the variables as @IF(C=1,D,Q), we can interpret the state of the function as: If the clock C is true, the state is equal to D; otherwise, it remains Q. The Q' output is handled with the #NOT# operator.

Given the ability to simulate logic components with spreadsheet functions and operators, let's now look at how we can use this technique to build a simple digital circuit. The synchronizing circuit of figure 2 is a commonly encountered arrangement. Known variously as an edge detector, a synchronizing circuit, and a digital differentiator, it develops a pulse one clock period long when an external,

Designing electronic circuits with Lotus 1-2-3. Seriously. It's not a prank. Or it is, but a supreme one.
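If you want to play with the idea without hunting down a copy of 1-2-3, the cell formulas map almost one-to-one onto any language. Here is the edge detector built from the article's blocks (#NOT#, the NAND and the @IF flip-flop) as Python functions; the two-flip-flops-in-series wiring is my guess at the figure 2 circuit, which the scan doesn't show:

```python
# The article's building blocks as functions instead of 1-2-3 cells.
def NOT(a):                     # #NOT#(A=1)
    return not a

def NAND(a, b):                 # #NOT#(A=1#AND#B=1)
    return not (a and b)

def dff(c, d, q):               # @IF(C=1,D,Q): a D flip-flop
    return d if c else q

def edge_detector(inputs):
    """Two flip-flops in series; Q1 AND NOT Q2 goes high for exactly one
    clock period after the input signal rises."""
    q1 = q2 = False
    for d in inputs:
        q1, q2 = dff(True, d, q1), dff(True, q1, q2)   # one clock tick
        print(int(d), int(q1 and NOT(q2)))

edge_detector([0, 0, 1, 1, 1, 0, 0])   # the pulse fires once, on the rising edge
```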

And I return to my fetish topic, «things that wouldn't get published today in a general-interest magazine, not even as a joke»:

One Million Primes Through the Sieve

Generate a million primes on your IBM PC without running out of memory

by T. A. Peng

A POPULAR WAY to benchmark microcomputers is with the Sieve of Eratosthenes. It is a simple and effective method for generating prime numbers. However, if you try to use the Sieve to obtain more than a few thousand primes on your IBM PC, you will soon encounter the dreaded phrase, "Out of memory." You would think, then, that as far as microcomputers are concerned, the Sieve of Eratosthenes would be an impractical way to generate a large number of primes. This is not so. Let me show you how to use the Sieve to generate a million primes on your microcomputer.

Listing 1 (written in Microsoft BASIC) illustrates how, with very little memory, you can put 500,000 numbers through the Sieve to obtain all the primes less than 1,000,000. The idea is quite simple. Use an array of flags to represent the first 1000 odd numbers. After the nonprimes among them have been sieved out, reinitialize the array to represent the next 1000 odd numbers. Lines 120 through 140 initialize the array and lines 340 through 360 reinitialize it before you use it for the next 1000 numbers.

The largest prime whose square is less than 1,000,000 is 997 and it is the 168th prime, starting with the prime 2. To generate all the primes less than 1,000,000, you don't have to use primes larger than 997. This is the reason for line 220 and for the size of two of the arrays in line 110. The loop in lines 240 through 270 flags all numbers less than 1000 that do not yield primes. (We have K = I + nP, so that K + K + 1 = (I + I + 1) + 2nP = P(2n + 1), which is not a prime.) After each loop is executed, the value of K will be greater than 1000 (and K would flag the next number if the size of the array were larger) and this is remembered as K(C). The variable C keeps count of the primes generated, with C - 1 as the actual number of primes generated at the end of each loop. Line 390 assures that the value of K lies between 1 and 1000. You need line 460 to give the correct value for the prime Q in line 490. All the variables except C, Q, and R are integer-valued. There is a reason for this. If the program executes correctly, the output of line 540 should read, "999,983 is the 78,498th prime and the largest less than 1,000,000."

It is clear how to modify listing 1 to generate all the primes less than 2,000,000 or even 10,000,000, but to get a predetermined number of primes, we need to know a little about their distribution. Specifically, what we need to know is the size of the arrays K and P and the largest prime to be used in the Sieve. And in order to know this, we must have a rough idea of how large the...

The Sieve of Eratosthenes, friends. Which, by the way, is not an especially complicated algorithm to understand (we leave as an exercise for the reader turning the page and trying to understand the BASIC code on the following page :-)). Now I feel like checking how much RAM is consumed by the little Python program that ChatGPT generates in less time than you'd need to type the first three lines of the program proposed in the magazine… but not enough to actually do it O:-).
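Since the scan is rough on the eyes, here's the same technique in modern Python: a segmented sieve that marches a small, reusable window of flags over the numbers, in the spirit of what Peng describes rather than a transcription of his BASIC listing:

```python
# Segmented Sieve of Eratosthenes: all primes below 1,000,000 using a small,
# reusable window of flags instead of one million-entry array.

LIMIT = 1_000_000
SEGMENT = 10_000                           # ~10 KB of flags at any moment

# Base primes up to sqrt(LIMIT); 997 is the largest needed, as the article notes.
root = int(LIMIT ** 0.5) + 1
flags = bytearray([1]) * root
flags[0:2] = b"\x00\x00"
for i in range(2, int(root ** 0.5) + 1):
    if flags[i]:
        flags[i * i::i] = bytearray(len(flags[i * i::i]))
base_primes = [i for i in range(root) if flags[i]]

# Sieve one window at a time, reusing the same small flag array.
count, largest = 0, 0
for low in range(2, LIMIT, SEGMENT):
    high = min(low + SEGMENT, LIMIT)
    seg = bytearray([1]) * (high - low)
    for p in base_primes:
        start = max(p * p, (low + p - 1) // p * p)   # first multiple in window
        seg[start - low::p] = bytearray(len(seg[start - low::p]))
    for offset, is_prime in enumerate(seg):
        if is_prime:
            count, largest = count + 1, low + offset

print(f"{largest:,} is the {count:,}th prime and the largest less than {LIMIT:,}")
# -> 999,983 is the 78,498th prime and the largest less than 1,000,000
```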

And to close… multitasking:

Top View

IBM's long-awaited multitasking program makes its debut

BY TJ Byers

TOPVIEW is a multitasking program that, for $149, enables your IBM Personal Computer to install more than one program in the system. This is different from the window programs that presently claim to accomplish the same thing. When working with windows, you must quit a program before you can begin another. With TopView, however, you don't have to quit either one of them. Both can be resident on the screen— and, more important, in the microprocessor— at the same time.

Multitasking

TopView's multitasking capabilities allow several programs to run simultaneously (see photo 1). This isn't the same thing as switching between programs without quitting them; it means that you can actually have one program running in the background while using another. Let's say, for example, that you need to calculate a large spreadsheet, and the job will take several minutes. Instead of staring idly at the screen while the computer crunches away, you can banish the spreadsheet to TopView's background mode and proceed to work on another program— the computer will handle both tasks at the same time. While one program is making calculations in the background, the other can be receiving data from the keyboard. You lose no time waiting for one program to finish before you start the other.

Multitasking is not a new concept. Mainframe computers have used multitasking for many years to enhance their performance. What is new, however, is putting multitasking capabilities into a personal computer.

TopView brings multitasking to the IBM PC using a multiplexing technique known as time slicing. Basically, TopView divides the microprocessor's time into slots during which it switches rapidly from one program to another. The time slices are very short, on the order of milliseconds, and the switching action is not apparent to either the application program or the user, so the programs appear to be running concurrently on the machine. In actuality, they are processed consecutively in very quick order. The procedure gives a single computer the ability to run more than one program at a time.

Multitasking is not without its faults, however. While one program is being processed, the others are held in suspension. Consequently, the programs tend to run more slowly. The more programs you have running at the same time, the slower each apparently becomes. A quick benchmark test using TopView to conduct a simple word search of Writing Assistant on an IBM PC AT showed that it took a full 14 seconds to search a typical 3000-word file as...

Because in 1985, a personal computer being able to run multiple programs in parallel was not exactly trivial. So little so that charging 150 dollars for the program that did it wasn't outlandish. Even if it cut your software's performance by 75% (something you'd only notice when running compute-intensive programs, sure, but you were the one who had to think about that) or ate a good chunk of the computer's RAM.
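Time slicing is easy to build an intuition for with a toy: hand each «program» a short slice of attention in turn, quickly enough that nobody notices. A cooperative, generator-based sketch, which is emphatically not how TopView preempted real DOS binaries:

```python
# Toy round-robin time slicing: each resume of a generator is one "slice".
from collections import deque

def task(name, steps):
    for i in range(steps):
        print(f"{name}: step {i}")
        yield                          # end of this task's time slice

ready = deque([task("spreadsheet", 3), task("editor", 3)])
while ready:
    t = ready.popleft()
    try:
        next(t)                        # run one slice
        ready.append(t)                # back to the end of the queue
    except StopIteration:
        pass                           # task finished; drop it
```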

By the way: the «window» interfaces of the era were priceless (although, as it happens, «TUI» programs are coming back into fashion today, in a wonderful return to the past :-)).

A couple of photos of the attempts at showing several applications on screen using a purely textual interface. I don't feel capable of giving a faithful description.

Well, we'll leave it here, since we're running late. More next month in a few days (more likely weeks).

As usual, you have the Byte magazine archives at archive.org, and if you like, you can get a head start with the December issue.

Everything is an AI remix

For some time now (months) I'd had a browser tab open on the latest version of that wonderful video that is Everything is a Remix

…with the intention of finding its rereading «in the age of generative AI». And it turns out that, in the last link of #67 of Interaccions, Sergi Muñoz Contreras's wonderful newsletter, he reveals to me that this rereading has, in fact, already been done by its own author, Kirby Ferguson. And having watched it, I think it's among the best things said on the subject (because on a great many occasions, even when the argument aligns with my own ideas, the quality of the argumentation strikes me as poor).

My recommendation is that, even if you saw one of the multiple versions of Everything back in the day, you rewatch it calmly now before watching the second video.

And I'll only add that, being a very, very big fan of Everything, and agreeing quite a lot (though not entirely) with the rereading, I think it skips over what I call «the apprentice's problem» (which surely has a better name): even though plenty of uses of AI as a tool (many of them, even, potentially; like the ones proposed in AI is remixing, for example) seem legitimate to me (I'm deliberately sidestepping the ocean-sized puddles of sustainability and respect for intellectual property, to be sure), AI is a weapon of mass destruction for the learning process that turns an apprentice into a master, and that is something that terrifies me. I think we'll solve it someday. But it terrifies me.

Scott and Mark and the teaching of programming

Scott Hanselman is a thoroughly nice guy with a highly recommendable podcast (from the latest episodes, I'd pick the one on the Interim Computer Museum or the one with Anne-Laure Le Cunff, but the average level is high). Scott is, besides, the vice president of «developer community» at a little company called Microsoft (which allowed him to open up the source code of Microsoft's BASIC for the 6502, the one that taught a lot of people to program on the Apple II and on the Commodore PET, VIC 20 and Commodore 64, the one I learned on, and for that alone I'd be eternally grateful to him).

At Microsoft, Scott met Mark Russinovich, whose LinkedIn says he is «CTO, Deputy CISO and Technical Fellow, Microsoft Azure», but who may ring more of a bell (for those of you of a certain age who like computers and use Windows) from Sysinternals. And Scott and Mark have another podcast, a very decent one too, Scott and Mark Learn To…, which in its latest episodes has talked quite a bit about a product Microsoft sells hard: programming with generative AI. And of all those episodes, I'll keep this fragment and what Russinovich says toward the end. I leave you the video at the indicated timestamp, accompanied by the transcript in English, first, and the Spanish translation, after.

Just a few comments beforehand…

  • …they talk about computing, but it applies to many other fields, if not all of them,
  • it's not the most original opinion in the world, but it's good that it comes from who it comes from,
  • the claim that universities don't have a good model is rigorously true, but let's see who's clever enough to come up with a solution that isn't brutally intrusive or impossibly expensive,
  • and I'm highlighting one fragment of the conversation, but there's another part, about companies and what they look for / should look for when hiring young people, that is also very good (translation: aligned with what I think, since it's a topic that also hits relatively close to home for me); in general the whole episode, and the whole podcast, are quite recommendable.

And that's it; I'll leave you with the video, the transcript and the translation.

More another day.

Transcript

(From the YouTube transcript, corrected and lightly edited by me to make it somewhat more readable (I hope). The bold is mine.)

—So as we end… we’re at an inflection point. What should university students that are studying CS right now, sophomores, juniors, seniors, in CS, be thinking about as we are in that point?

— I have a friend that’s got a student in computer science that’s a junior, and he said he was talking to them, asking them, do you use AI, and he says, like, yeah, a lot of my fellow students are using AI. I don’t use AI, because I want to learn, the hard way.

— I think both is the right answer, though, I feel.

— I think both, but here’s what I’ll tell you right now. I think that universities don’t have a good model, you know, consistent.

— They’re behind. Academia might, but research level academia.

— But not for teaching undergrads. And, actually, I think what is coming into view for me is that you need classes where using AI for certain projects or certain classes is considered cheating. Not to say that you don’t have classes and projects in some classes where the student is told to use AI, but you need to have the main basis for the education on computer science and programming to be AI-less, because that’s the only way the student’s going to learn.

— I’ve been saying «drive stick shift». And I get told that I’m being gatekeepy when I say that.

— I don’t think you are, because there is a great study of three months ago from MIT where they took, um, not students, but they took people in careers, already in the workforce, and they divided them into three cohorts and had them write essays from the SAT, and they had one cohort just do it with their own closed book, just write the essay. They had another cohort that got to use Google, and they had another cohort that got to use ChatGPT, and they looked at their EEGs, and they quizzed them afterwards, right after, and then like a week later, and the results were exactly what you would expect. The people that wrote it could answer questions about what they wrote, even a week later, and their EEGs showed that they were burning a lot of wattage. The people that were using ChatGPT, an hour after they wrote the essay, they couldn’t remember what they’d written.

— That’s the thing. It’s just not even there. That makes me really sad. I very much enjoy using AI to brainstorm, to plan, but then I want to do the writing part. To vibe your way through life has me concerned.

— You lose the critical thinking. And they call this critical thinking deficit, that is what it’s creating…

— Which we already have from social media.

— Yeah, we already have. And if you’re talking about the early and career programmers that we’ve been talking about wanting to hire at a company, you want them to know what a race condition is. You don’t want them to have vibed it and AI is like, «Yeah, a race condition. AI will fix that.» Because at some point, as we’ve said, I think with the limitations of AI and software programming, at least for the foreseeable future somebody needs to know.

Traducción

(Into Spanish, done with ChatGPT and reviewed by me. The bold is still mine.)

—Así que, para cerrar… estamos en un punto de inflexión. ¿Qué deberían estar pensando los estudiantes universitarios que estudian informática ahora mismo?

—Tengo un amigo que tiene un hijo que estudia informática, está en tercer año, y me dijo que le preguntó: “¿Usas IA?” Y él respondió: “Sí, muchos de mis compañeros la usan. Yo no, porque quiero aprender por el camino difícil.”

—Creo que ambas cosas son la respuesta correcta, sinceramente.

—Sí, ambas, pero te diré algo: creo que las universidades no tienen un modelo adecuado, coherente.

—Van por detrás. Quizás la academia investigadora sí, pero…

—Pero no en la enseñanza de grado. De hecho, creo que lo que se está haciendo evidente es que necesitamos clases en las que usar IA para ciertos proyectos o asignaturas se considere hacer trampa. No porque no debas tener otras clases o proyectos donde se indique explícitamente al estudiante que use IA, sino porque la base principal de la formación en informática y programación debe ser sin IA, porque es la única forma en que el estudiante realmente aprenderá.

—Yo suelo decir “hay que aprender a conducir con cambio manual”. Y me dicen que eso es una postura elitista1.

—No creo que lo sea, porque hay un estudio excelente de hace tres meses del MIT donde tomaron… no estudiantes, sino profesionales ya en activo, y los dividieron en tres grupos para que escribieran ensayos del tipo de la selectividad. A un grupo le dijeron que lo hiciera sin ayuda, a otro que podía usar Google, y a otro que podía usar ChatGPT. Luego midieron sus electroencefalogramas y los evaluaron justo después y una semana más tarde. Los resultados fueron exactamente los que esperarías: las personas que escribieron el ensayo por sí mismas eran capaces de responder preguntas sobre lo que habían escrito incluso una semana después, y sus electroencefalogramas mostraban mucha actividad cerebral. En cambio, quienes usaron ChatGPT, una hora después ya no recordaban lo que habían escrito.

—Eso es. Es que ni siquiera está ahí. Y eso me pone muy triste. Me gusta mucho usar la IA para generar ideas, para planificar, pero luego quiero escribir yo. Esa actitud de “vibear”2 la vida me preocupa.

—Se pierde el pensamiento crítico. Y eso está generando un déficit de pensamiento crítico…

—Que ya teníamos por culpa de las redes sociales.

—Sí, ya lo teníamos. Y si hablamos de los programadores jóvenes o principiantes que queremos contratar en una empresa, quieres que sepan lo que es una condición de carrera (race condition). No quieres que lo hayan “vibeado” y que la IA les diga: “Sí, una condición de carrera, la IA lo arreglará.” Porque, como ya hemos dicho, con las limitaciones de la IA en la programación de software, al menos en el futuro cercano, alguien tiene que saberlo.

  1. «Gatekeepy» in the original. Here, «to gatekeep» would be putting up unnecessary barriers to entry, or «checking credentials at the door». ↩︎
  2. «Vibear» is my Spanish rendering of «to vibe code»: building programs purely by prompting generative AIs, without writing a line of code. ↩︎

Byte, October '85

Cover of the October 1985 issue of Byte. The cover story is Simulating Society. It's illustrated by a sheet of printer paper wrapping around some human faces.

Here we go with our rereading of the latest in computing… from forty years ago, through the Byte magazine archives at archive.org. Today it's October 1985's turn.

To start, don't complain that you aren't witnessing history's great advances. May I present… the high-density floppy! (I think most of you reading this are old enough to appreciate that jumping from 720 kilobytes to 1.44 megs, while not revolutionary, was quite a leap.)

Sony, Toshiba Prepare High-Density 3½-Inch Disks

Sony announced in Tokyo that it has developed a 2-megabyte 3½-inch floppy disk, storing 1.6 megabytes (formatted) by doubling the number of sectors per track. The 2-megabyte medium uses a 1-micron magnetic layer (half the thickness of current 1-megabyte disks) and requires a higher coercivity (700 rather than 600-620 oersteds).

While the 2-megabyte versions use the same magnetic technology as earlier 3½-inch disks and drives, the magnetic heads of the drives require higher tolerances. An additional disk cartridge hole allows drives to distinguish between 1- and 2-megabyte disks.

Although it has already licensed 38 companies to produce 2-megabyte disks, Sony says it is waiting for formal standards to be set before marketing the disks and drives, which should be available to OEMs next year, probably at prices about 20 percent higher than 1-megabyte versions.

An even denser 3½-inch drive from Toshiba uses perpendicular recording technology to squeeze 4 megabytes of data onto a single-sided disk coated with barium ferrite. Toshiba plans to release evaluation units early next year, with full production slated for 1987.


Raise your hand if you knew / remembered that before Access, Microsoft's database (which wouldn't arrive until 1992), there was a Microsoft Access for connecting to information services over a modem (I had no idea / didn't remember it at all). The hegemony of the database Access is such that I've barely been able to find any further information about it.

Advertisement for Microsoft Access. It shows a computer topped by the handset of a desk telephone, snapped in half. The headline is Don't get mad, get Access.

In our regular «you think this was just invented, but no» section, we have the books pages, which cover Computer Culture: The Scientific, Intellectual, and Social Impact of the Computer, available, of course, on archive.org. The book collected the papers from the conference of the same name, because it's not only on Despacho 42 that we worry about these things, and naturally it was already worrying about the impact of AI…

Artificial Intelligence

Approximately one-fourth of Computer Culture (four papers and one panel discussion) deals specifically with artificial intelligence. The panel discussion on the impact of AI research is the most thought-provoking contribution in the book. As you might expect, this discussion is not so concise as an article dealing with the same topic, but the interaction among the panel members is intriguing. The panel consists of two philosophers (Hubert Dreyfus and John Searle) and three computer scientists (John McCarthy, Marvin Minsky, and Seymour Papert). Much of the discussion is spent identifying important questions about AI. Each panelist has a distinct viewpoint, resulting in a diversity of questions. Among these, however, two issues are of overriding concern: Can machines think? If they can, is machine thinking the same as human thinking?

The panelists seem to agree that computers can be used to study thinking, if for no other reason than to provide a contrast with human thought processes. On the other hand, the suggestion that appropriately programmed computers could duplicate human thought processes is much more controversial.

Aside from the philosophical issues, Papert makes a very important point when he argues that it is dangerous to reassure people that machines will never be able to challenge the intellectual capabilities of human beings. If people are lulled into a sense of security about machine capabilities, they will be ill prepared to deal with situations in which machines become better than people at doing specific jobs, he says. Whether or not the machines are described as thinking in these situations, the social and psychological issues raised by machine capabilities demand attention.
(I'm linking to the opening page of the books section rather than to the specific page of the excerpt above. In any case, the full review is worth reading… and so is the book, if you get the chance.)

More things that weren't invented yesterday. I don't watch much of the kicking-a-ball kind of football, but I do watch a fair amount of American football, a sport whose broadcasts wouldn't be the same without the obligatory skycam, a camera that flies over the field of play hanging from four cables. And yes, it's turning forty:

Skycam: An Aerial Robotic Camera System

A microcomputer provides the control to add three-dimensional mobility to TV and motion picture cameras

On a morning in March 1983, a group of technicians gathered at Haverford High School in a suburb of Philadelphia. Each brought an electrical, mechanical, or software component for a revolutionary new camera system named Skycam (see photo 1). Skycam is a suspended, mobile, remote-controlled system designed to bring three-dimensional mobility to motion picture and television camera operation. (See the text box on page 128.) I used an Osborne 1 to develop Skycam's control program in my basement, and it took me eight months of evenings and weekends. As of 3 a.m. that morning, however, the main control loop refused to run. But 19 hours later, Skycam lurched around the field for about 15 minutes before quitting for good. Sitting up in the darkness of the press booth, hunched over the tiny 5-inch screen, I could see that the Osborne 1 was not fast enough to fly the Skycam smoothly.

In San Diego 18 months later, another group of technicians opened 20 matched shipping cases and began to get the Skycam ready for an NFL preseason game between the San Diego Chargers and the San Francisco Forty-Niners. The Skycam was now being run by an MC68000-microprocessor-based Sage computer, and a host of other improvements had been made on the original. [Editor's note: The Sage computer is now known as the Stride; however, the machine used by the author was purchased before the company's name change. For the purpose of the article, the machine will be referred to as the Sage.] For the next three hours, Skycam moved high over the field, fascinating the fans in the stadium while giving the nationwide prime-time TV audience their first look at a new dimension in sports coverage.

Skycam represents an innovative use of microcomputers. The portable processing power needed to make Skycam fly was unavailable even five years ago. That power is the "invention" upon which the Skycam patents are based. It involves the support and free movement of an object in a large volume of space. The development team used the following experiment to test the movement and operation of the Skycam.

At a football field with one lighting tower at each of four corners, the team members bolted a pulley to the top of each pole, facing inward. Then they used four motorized winches, each with 500 feet of thin steel cable on a revolving drum and put one at the base of each tower.

Next, they ran a cable from each motor to the top of its tower and threaded the cable through the pulley. They pulled all four cables from the tops of the towers out to the middle of the field and attached the cables to a metal ring 2 feet in diameter weighing 10 pounds (see figure 1). A motor operator was stationed at each winch with a control box that enabled the operator to slowly reel in or let out the cable. Each motor operator reeled the cable until the ring was suspended a few feet from the ground, and then they were ready to demonstrate Skycam dynamics.

All four motor operators reeled in the cable. The ring moved upward quickly. If all four motors reel in at the same rate (and the layout of lighting towers is reasonably symmetrical) the ring will move straight up. In the experiment, the two motors on the left reeled in and the two on the right reeled out. The ring moved to the left and maintained its altitude. An instruction was given to the two motor operators on the left to reel out and the two on the right to reel in just a little bit. The ring moved right and descended as it moved back toward the center.

The theoretical basis of this demonstration is quite simple. For each point in the volume of space bounded by the field, the four towers and the plane of the pulleys, there is a unique set of four numbers that represents the distances between that point and each of the four pulley positions. Following the layout above for an arbitrary point on the field, you can...
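
(The geometry is simple enough to sketch. Below, a minimal Python illustration of the idea, mine rather than the authors' code: given the four pulley positions, every reachable point maps to a unique 4-tuple of cable lengths, so flying the camera means driving each winch toward the lengths for the next point.)

```python
import math

# Pulley positions at the tops of the four lighting towers, in meters.
# The coordinates are made up for illustration.
PULLEYS = [(0, 0, 30), (110, 0, 30), (110, 50, 30), (0, 50, 30)]

def cable_lengths(point):
    """The unique 4-tuple of distances from `point` to each pulley."""
    return [math.dist(point, p) for p in PULLEYS]

# Camera over the center of the field, a few meters up:
print([round(l, 1) for l in cable_lengths((55, 25, 5))])

# Move it left at the same altitude: the two left cables shorten and
# the two right cables lengthen, just as in the ring experiment.
print([round(l, 1) for l in cable_lengths((35, 25, 5))])
```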

But this month I'm staying with the cover theme: the use of computer simulation to model society:

Simulating Society

THE NEED FOR GREATER RIGOR in the social sciences has long been acknowledged. This month's theme examines computer-based simulation as a means to achieving that end. Simulation may be able to assist in evaluating hypotheses, not in the sense that an experiment in the physical sciences can test a hypothesis, but in the sense of making plain the ramifications of a hypothesis. The value of specifying a hypothesis with sufficient clarity to be amenable to programming and of examining the consequences of that hypothesis should not be underestimated. Indeed, one of the interesting aspects of the work presented here is that these researchers appear to be developing a tool for the social sciences that is not simply a poor stepchild of physical science methodologies.

Our first article, "Why Models Go Wrong" by Tom Houston, is a wonderfully readable account of the ways that you can misuse statistics.

Next, Wallace Larimore and Raman Mehra's "The Problem of Overfitting Data" discusses a difficult but important topic. Overfitting happens when your curve traces the noise as well as the information in your data. The result is that the predictive value of the curve actually deteriorates.
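
(Overfitting is easy to demonstrate in a few lines. A minimal sketch with NumPy, mine and not the article's: fit a low-degree and a high-degree polynomial to noisy samples of a straight line, then compare their errors on points they haven't seen.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of a simple underlying line, y = 2x + 1.
x_train = np.linspace(0, 1, 12)
y_train = 2 * x_train + 1 + rng.normal(0, 0.2, x_train.size)

# Noise-free test points from the same line.
x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test + 1

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    pred = np.polyval(coeffs, x_test)
    # The degree-9 fit traces the training noise, so it will typically
    # show a worse error on the unseen points, not a better one.
    print(degree, round(float(np.mean((pred - y_test) ** 2)), 4))
```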

In "Testing Large-Scale Simulations," Otis Bryan and Michael Natrella show how validation (determining whether the specification for the simulation corresponds with reality) and verification (determining whether the simulation program corresponds with the specification) were achieved on a large-scale combat simulation they developed for the Air Force.

The ways of economic modeling are illustrated by Ross Miller and Alexander Kelso, who show how they analyzed the effects of proposed taxes for funding the EPA Superfund in "Analyzing Government Policies."

Michael Ward discusses his ongoing research in simulating the U.S.-Soviet arms race in "Simulating the Arms Race."

Several authors discuss new and surprising applications of simulation. In "EPIAID," Dr. Andrew Dean describes the development of computer-based aids for Centers for Disease Control field epidemiologists. Royer Cook explains how he fine-tuned a model in "Predicting Arson," and Bruce Dillenbeck, who uses an arson-prediction program in his work as a community activist, discusses modeling in "Fighting Fire with Technology."

Articles in other sections of the magazine that relate to this theme include Zaven Karian's review of GPSS/PC and Arthur Hansen's Programming Insight "Simulating the Normal Distribution."

When I began researching this theme, I took an excellent intensive course in simulation from Edward Russell of CACI. Dr. Russell's is the unseen hand guiding the development of this theme. Of course, any blame for bias in the choice of theme topics belongs to me, but much of the credit for the quality that is here must reside with him.

Don't miss the articles on the pitfalls, starting with the two that open the section, about the risks of bad modeling (a topic that, unfortunately, is even more relevant today than it was forty years ago), and continuing with the one on economic modeling with Lotus 1-2-3, or the one on epidemiology.

Oh, and while we're on the subject of modeling… did you know that SPSS not only already existed in 1985, in the shape of SPSS/PC+, but had originally launched back in 1968? If anyone can think of a piece of software that's been on the market longer, do let me know.

Advertisement for SPSS/PC+. The slogan is Make Stat Magic. It's illustrated with a photo of a magician's top hat, out of which rises a 5¼-inch floppy labeled SPSS/PC+.

And we're not going to skip the Amiga, of course. This time it's Bruce Webster, another of the magazine's star columnists, telling us how blown away he is by the system's power, price, and elegance:

According to Webster

Commodore's Coup

Product of the Month: Amiga

Last month, I made a few comments about the future of the home computer market, based on rumors I had heard about the Amiga from Commodore. In essence, I said that if what I had heard was true, the Amiga might be the heir to the Apple II in the home/educational/small business marketplace.

Since writing that, I have seen the Amiga. I have watched demonstrations of its abilities; I have played with it myself; and I have gone through the technical manuals. My reaction: I want to lock myself in a room with one (or maybe two) and spend the next year or so discovering just what this machine is capable of. To put it another way: I was astonished. Hearing a description of a machine is one thing; seeing it in action is something else, especially where the Amiga is concerned.

I can tell you that the low-resolution mode is 320 by 200 pixels, with 32 colors available for each pixel (out of a selection of 4096). But that does not prepare you for just how stunning the colors are, especially when they are properly designed and combined. It also doesn't tell you that you can redefine that set of 32 colors as the raster-scanning beam moves down the screen, letting you easily have several hundred colors on the screen simultaneously.
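
(Back-of-the-envelope, and the numbers are mine, not Webster's: reload even a couple of palette entries per scanline and the count of distinct colors on screen sails past the base 32.)

```python
# Rough upper bound on distinct on-screen colors when the palette can
# be rewritten as the beam scans down the frame. Illustrative only.
BASE_PALETTE = 32       # simultaneous entries in the low-res mode
SCANLINES = 200         # visible lines in the 320-by-200 mode
SWAPS_PER_LINE = 2      # hypothetical palette writes per scanline

# Capped at the 4096 colors the hardware can produce at all.
max_colors = min(BASE_PALETTE + SCANLINES * SWAPS_PER_LINE, 4096)
print(max_colors)       # 432 -- "several hundred colors", as promised
```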

It also doesn't tell you how blindingly fast the graphics hardware is. If you've seen some of Commodore's television commercials demonstrating the Amiga's capabilities, or if you've looked at the machine yourself, you have some idea as to what the machine can do. If you haven't, I'm not sure I can adequately describe it.

Having seen the graphics on the Amiga, I have to smile when I hear people lump it together with the Atari 520ST. The high-resolution mode on the ST is 640 by 400 pixels with 2 colors (out of 512); on the Amiga, it is 640 by 400 pixels with 16 colors (out of 4096), and you can redefine those 16 colors as the raster-scanning beam goes down the screen. Also, the graphics hardware supporting all those colors is much faster. Little wonder, then, that a friend of mine, a game developer with several programs on the market, came back from the Amiga developers' seminar with plans to return the Atari ST development system at his house and to turn his attentions to the Amiga instead.

As I guessed last month, the real strength of the Amiga is its totally open architecture. An 86-pin bus comes out of one side of the machine, giving any add-on hardware complete control of the machine. What's more, 512K bytes of the 68000's 16-megabyte address space have been set aside for expansion hardware, 4K bytes each for 128 devices. A carefully designed protocol tells hardware manufacturers what data they should store in ROM (read-only memory) so that the Amiga can automatically configure itself when booted. This is a far cry from the closed-box mentality of the Macintosh, which has forced many hardware vendors through weird contortions just to get their devices to talk consistently to the Mac without crashing.
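
(The arithmetic of that expansion window is tidy: 512K bytes carved into 4K-byte slices is exactly 128 devices. A sketch; the base address is a placeholder of mine, not the Amiga's real memory map.)

```python
WINDOW_BYTES = 512 * 1024   # expansion space reserved in the 68000's map
SLOT_BYTES = 4 * 1024       # one slice per device
BASE = 0x00E00000           # placeholder base address, for illustration

print(WINDOW_BYTES // SLOT_BYTES)   # 128 devices, as the column says

# Base addresses of the first few hypothetical device slots:
for n in range(3):
    print(f"device {n}: {BASE + n * SLOT_BYTES:#010x}")
```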

The memory map is well thought out. The Amiga comes with 256K bytes of RAM (random-access read/write memory); an up...

Sniff.

If you read the whole thing, please don't be alarmed when you reach the part where he mentions that RAM goes for 350 dollars (a bit over a thousand, adjusted for inflation) per 256 kilobytes. In other words, for what 256 kilobytes cost back then, today you can buy some 320 gigabytes. A million to one. (And I suppose you won't be too surprised to learn that Apple's profit margins on the RAM it sells for its systems are not a twenty-first-century invention.)
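
(And the million-to-one figure holds up, with my rounding:)

```python
KB = 1024
GB = 1024 ** 3

then_bytes = 256 * KB   # $350 in 1985, a bit over $1,000 today
now_bytes = 320 * GB    # roughly what a similar sum buys in RAM now

print(f"{now_bytes / then_bytes:,.0f} to one")  # 1,310,720 to one
```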

And we'll leave it here for this month. See you next month, with the November issue.