Byte, October '85

Cover of the October 1985 issue of Byte. The cover theme is Simulating Society, illustrated by a sheet of printer paper wrapped around human faces.

Here we go again with our re-reading of the latest in computing… from forty years ago, through the Byte magazine archives at archive.org. Today it's October 1985's turn.

To start with, don't complain that you aren't witnessing the great milestones of history. I give you… the high-density floppy disk! (I suspect most of you reading me are old enough to appreciate that jumping from 720 kilobytes to 1.44 megabytes, while not revolutionary, was quite a leap.)

Sony, Toshiba Prepare High-Density 3½-inch Disks

Sony announced in Tokyo that it has developed a 2-megabyte 3½-inch floppy disk, storing 1.6 megabytes (formatted) by doubling the number of sectors per track. The 2-megabyte medium uses a 1-micron magnetic layer (half the thickness of current 1-megabyte disks) and requires a higher coercivity (700 rather than 600-620 oersteds).

While the 2-megabyte versions use the same magnetic technology as earlier 3½-inch disks and drives, the magnetic heads of the drives require higher tolerances. An additional disk cartridge hole allows drives to distinguish between 1- and 2-megabyte disks.

Although it has already licensed 38 companies to produce 2-megabyte disks, Sony says it is waiting for formal standards to be set before marketing the disks and drives, which should be available to OEMs next year, probably at prices about 20 percent higher than 1-megabyte versions.

An even denser 3½-inch drive from Toshiba uses perpendicular recording technology to squeeze 4 megabytes of data onto a single-sided disk coated with barium ferrite. Toshiba plans to release evaluation units early next year, with full production slated for 1987.

Raise your hand if you knew / remembered that before Access, Microsoft's database (which wouldn't arrive until 1992), there was a Microsoft Access for connecting to information services over a modem (I had no idea / didn't remember it at all). The database Access is so dominant that I've barely been able to find any more information about this one.

Ad for Microsoft Access. It's illustrated by a computer with the handset of a desk telephone, broken in half, lying on top of it. The headline is Don't get mad, get Access.

In our regular "you think this was just invented, but no" section we have the books pages, which cover Computer Culture: The Scientific, Intellectual, and Social Impact of the Computer, available, of course, on archive.org, a volume collecting the papers from the conference of the same name, because it isn't only on Despacho 42 that we worry about these things, and naturally people were already worrying about the impact of AI…

Artificial Intelligence

Approximately one-fourth of Computer Culture (four papers and one panel discussion) deals specifically with artificial intelligence. The panel discussion on the impact of AI research is the most thought-provoking contribution in the book. As you might expect, this discussion is not so concise as an article dealing with the same topic, but the interaction among the panel members is intriguing. The panel consists of two philosophers (Hubert Dreyfus and John Searle) and three computer scientists (John McCarthy, Marvin Minsky, and Seymour Papert). Much of the discussion is spent identifying important questions about AI. Each panelist has a distinct viewpoint, resulting in a diversity of questions. Among these, however, two issues are of overriding concern: Can machines think? If they can, is machine thinking the same as human thinking?

The panelists seem to agree that computers can be used to study thinking, if for no other reason than to provide a contrast with human thought processes. On the other hand, the suggestion that appropriately programmed computers could duplicate human thought processes is much more controversial.

Aside from the philosophical issues, Papert makes a very important point when he argues that it is dangerous to reassure people that machines will never be able to challenge the intellectual capabilities of human beings. If people are lulled into a sense of security about machine capabilities, they will be ill prepared to deal with situations in which machines become better than people at doing specific jobs, he says. Whether or not the machines are described as thinking in these situations, the social and psychological issues raised by machine capabilities demand attention.
(I'm linking to the opening page of the books section, rather than to the specific page of the fragment you have here. In any case, the full review is worth reading… and even the book, if you get the chance.)

More things that weren't invented yesterday. I watch little of the kick-a-ball kind of football, but quite a lot of American football, a sport whose broadcasts wouldn't be the same without the obligatory Skycam, a camera that flies over the field hanging from four cables. And yes, it's turning forty:

Skycam: An Aerial Robotic Camera System

A microcomputer provides the control to add three-dimensional mobility to TV and motion picture cameras

On a morning in March 1983, a group of technicians gathered at Haverford High School in a suburb of Philadelphia. Each brought an electrical, mechanical, or software component for a revolutionary new camera system named Skycam (see photo 1). Skycam is a suspended, mobile, remote-controlled system designed to bring three-dimensional mobility to motion picture and television camera operation. (See the text box on page 128.) I used an Osborne 1 to develop Skycam's control program in my basement, and it took me eight months of evenings and weekends. As of 3 a.m. that morning, however, the main control loop refused to run. But 19 hours later, Skycam lurched around the field for about 15 minutes before quitting for good. Sitting up in the darkness of the press booth, hunched over the tiny 5-inch screen, I could see that the Osborne 1 was not fast enough to fly the Skycam smoothly.

In San Diego 18 months later, another group of technicians opened 20 matched shipping cases and began to get the Skycam ready for an NFL preseason game between the San Diego Chargers and the San Francisco Forty-Niners. The Skycam was now being run by an MC68000-based Sage computer, and a host of other improvements had been made on the original. [Editor's note: The Sage computer is now known as the Stride; however, the machine used by the author was purchased before the company's name change. For the purpose of the article, the machine will be referred to as the Sage.] For the next three hours, Skycam moved high over the field, fascinating the fans in the stadium while giving the nationwide prime-time TV audience their first look at a new dimension in sports coverage.

Skycam represents an innovative use of microcomputers. The portable processing power needed to make Skycam fly was unavailable even five years ago. That power is the "invention" upon which the Skycam patents are based. It involves the support and free movement of an object in a large volume of space. The development team used the following experiment to test the movement and operation of the Skycam.

At a football field with one lighting tower at each of four corners, the team members bolted a pulley to the top of each pole, facing inward. Then they used four motorized winches, each with 500 feet of thin steel cable on a revolving drum, and put one at the base of each tower.

Next, they ran a cable from each motor to the top of its tower and threaded the cable through the pulley. They pulled all four cables from the tops of the towers out to the middle of the field and attached the cables to a metal ring 2 feet in diameter weighing 10 pounds (see figure 1). A motor operator was stationed at each winch with a control box that enabled the operator to slowly reel in or let out the cable. Each motor operator reeled the cable until the ring was suspended a few feet from the ground, and then they were ready to demonstrate Skycam dynamics.

All four motor operators reeled in the cable. The ring moved upward quickly. If all four motors reel in at the same rate (and the layout of lighting towers is reasonably symmetrical) the ring will move straight up. In the experiment, the two motors on the left reeled in and the two on the right reeled out. The ring moved to the left and maintained its altitude. An instruction was given to the two motor operators on the left to reel out and the two on the right to reel in just a little bit. The ring moved right and descended as it moved back toward the center.

The theoretical basis of this demonstration is quite simple. For each point in the volume of space bounded by the field, the four towers and the plane of the pulleys, there is a unique set of four numbers that represents the distances between that point and each of the four pulley positions. Following the layout above for an arbitrary point on the field, you can...
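The geometry the article is building toward is easy to sketch: given the four pulley positions, each camera position inside the volume determines a unique 4-tuple of cable lengths, and flying the camera is just a matter of recomputing those lengths along a path and commanding each winch accordingly. A minimal Python sketch, with made-up tower coordinates:

```python
import math

# Hypothetical pulley positions at the tops of the four lighting towers,
# in meters, as (x, y, z); the playing field lies in the z = 0 plane.
PULLEYS = [
    (0.0,   0.0,  30.0),
    (110.0, 0.0,  30.0),
    (110.0, 50.0, 30.0),
    (0.0,   50.0, 30.0),
]

def cable_lengths(point):
    """Return the four cable lengths that hold the camera at `point`.

    Each point inside the volume bounded by the field, the towers, and
    the plane of the pulleys maps to a unique 4-tuple of distances."""
    return tuple(math.dist(point, p) for p in PULLEYS)

# Moving the camera along a path means recomputing the four lengths at
# each step and telling each winch to reel in or out by the difference.
lengths = cable_lengths((55.0, 25.0, 10.0))
```

With symmetric towers, a point over the center of the field gives four equal lengths, exactly as in the ring experiment where all four operators reel in at the same rate.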

But this month my pick is the cover theme: using computer simulations to model society:

Simulating Society

THE NEED FOR GREATER RIGOR in the social sciences has long been acknowledged. This month's theme examines computer-based simulation as a means to achieving that end. Simulation may be able to assist in evaluating hypotheses, not in the sense that an experiment in the physical sciences can test a hypothesis, but in the sense of making plain the ramifications of a hypothesis. The value of specifying a hypothesis with sufficient clarity to be amenable to programming and of examining the consequences of that hypothesis should not be underestimated. Indeed, one of the interesting aspects of the work presented here is that these researchers appear to be developing a tool for the social sciences that is not simply a poor stepchild of physical science methodologies.

Our first article, "Why Models Go Wrong" by Tom Houston, is a wonderfully readable account of the ways that you can misuse statistics.

Next, Wallace Larimore and Raman Mehra's "The Problem of Overfitting Data" discusses a difficult but important topic. Overfitting happens when your curve traces the noise as well as the information in your data. The result is that the predictive value of the curve actually deteriorates.
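The effect Larimore and Mehra describe is easy to reproduce in a few lines. A tiny self-contained sketch of mine, with synthetic data: a degree-(n-1) polynomial through five noisy points "explains" them perfectly, yet predicts a held-out point far worse than a plain straight-line fit.

```python
# Synthetic data: points generated from y = 2x plus fixed "noise".
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
noise = [0.2, -0.3, 0.25, -0.2, 0.3]          # stands in for measurement error
ys = [2 * x + e for x, e in zip(xs, noise)]

def line_fit(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def lagrange(xs, ys, x):
    """Interpolating polynomial through every data point: it traces the noise."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

a, b = line_fit(xs, ys)
x_new = 5.0                      # held-out point; the true value is 10.0
err_line = abs((a * x_new + b) - 10.0)
err_poly = abs(lagrange(xs, ys, x_new) - 10.0)
# The interpolating polynomial's error at x_new dwarfs the line's.
```

The line recovers a slope close to 2 and misses the held-out point by a fraction; the interpolating polynomial, having fitted the noise, misses it by several units.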

In "Testing Large-Scale Simulations," Otis Bryan and Michael Natrella show how validation (determining whether the specification for the simulation corresponds with reality) and verification (determining whether the simulation program corresponds with the specification) were achieved on a large-scale combat simulation they developed for the Air Force.

The ways of economic modeling are illustrated by Ross Miller and Alexander Kelso, who show how they analyzed the effects of proposed taxes for funding the EPA Superfund in "Analyzing Government Policies."

Michael Ward discusses his ongoing research in simulating the U.S.-Soviet arms race in "Simulating the Arms Race."

Several authors discuss new and surprising applications of simulation. In "EPIAID," Dr. Andrew Dean describes the development of computer-based aids for Centers for Disease Control field epidemiologists. Royer Cook explains how he fine-tuned a model in "Predicting Arson," and Bruce Dillenbeck, who uses an arson-prediction program in his work as a community activist, discusses modeling in "Fighting Fire with Technology."

Articles in other sections of the magazine that relate to this theme include Zaven Karian's review of GPSS/PC and Arthur Hansen's Programming Insight "Simulating the Normal Distribution."

When I began researching this theme, I took an excellent intensive course in simulation from Edward Russell of CACI. Dr. Russell's is the unseen hand guiding the development of this theme. Of course, any blame for bias in the choice of theme topics belongs to me, but much of the credit for the quality that is here must reside with him.

Don't miss the articles on the pitfalls, starting with the two that open the section, on the risks of bad modeling (a topic that, unfortunately, matters even more today than it did forty years ago), and continuing with the one on economic modeling with Lotus 1-2-3, or the one on epidemiology.

Oh, and while we're on the subject of modeling… did you know that SPSS/PC+ not only already existed in 1985, but that the original SPSS had been on the market since 1968? If anyone can think of a piece of software that's been around longer, let me know.

Ad for SPSS/PC+. The slogan is Make Stat Magic. It's illustrated by a photo of a magician's top hat with a 5¼-inch floppy labeled SPSS/PC+ coming out of it.

And we're not going to stop talking about the Amiga, of course. This time it's Bruce Webster, another of the magazine's star columnists, telling us how blown away he is by the system's power, price, and elegance:

According to Webster

Commodore's Coup

Product of the Month: Amiga

Last month, I made a few comments about the future of the home computer market, based on rumors I had heard about the Amiga from Commodore. In essence, I said that if what I had heard was true, the Amiga might be the heir to the Apple II in the home/educational/small business marketplace.

Since writing that, I have seen the Amiga. I have watched demonstrations of its abilities; I have played with it myself; and I have gone through the technical manuals. My reaction: I want to lock myself in a room with one (or maybe two) and spend the next year or so discovering just what this machine is capable of. To put it another way: I was astonished. Hearing a description of a machine is one thing; seeing it in action is something else, especially where the Amiga is concerned.

I can tell you that the low-resolution mode is 320 by 200 pixels, with 32 colors available for each pixel (out of a selection of 4096). But that does not prepare you for just how stunning the colors are, especially when they are properly designed and combined. It also doesn't tell you that you can redefine that set of 32 colors as the raster-scanning beam moves down the screen, letting you easily have several hundred colors on the screen simultaneously.

It also doesn't tell you how blindingly fast the graphics hardware is. If you've seen some of Commodore's television commercials demonstrating the Amiga's capabilities, or if you've looked at the machine yourself, you have some idea as to what the machine can do. If you haven't, I'm not sure I can adequately describe it.

Having seen the graphics on the Amiga, I have to smile when I hear people lump it together with the Atari 520ST. The high-resolution mode on the ST is 640 by 400 pixels with 2 colors (out of 512); on the Amiga, it is 640 by 400 pixels with 16 colors (out of 4096), and you can redefine those 16 colors as the raster-scanning beam goes down the screen. Also, the graphics hardware supporting all those colors is much faster. Little wonder, then, that a friend of mine, a game developer with several programs on the market, came back from the Amiga developers' seminar with plans to return the Atari ST development system at his house and to turn his attentions to the Amiga instead.

As I guessed last month, the real strength of the Amiga is its totally open architecture. An 86-pin bus comes out of one side of the machine, giving any add-on hardware complete control of the machine. What's more, 512K bytes of the 68000's 16-megabyte address space have been set aside for expansion hardware, 4K bytes each for 128 devices. A carefully designed protocol tells hardware manufacturers what data they should store in ROM (read-only memory) so that the Amiga can automatically configure itself when booted. This is a far cry from the closed-box mentality of the Macintosh, which has forced many hardware vendors through weird contortions just to get their devices to talk consistently to the Mac without crashing.

The memory map is well thought out. The Amiga comes with 256K bytes of RAM (random-access read/write memory); an up...

Sniff.

If you read the whole thing, please don't be alarmed when you reach the part where he mentions that RAM runs $350 (a bit over a thousand dollars, adjusting for inflation) per 256 kilobytes. In other words, for what 256 kilobytes cost then, today you can buy about 320 gigabytes. A million to one. (And I suppose you won't be too surprised to learn that Apple's profit margins on RAM for its systems are not a twenty-first-century invention.)
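For the record, the arithmetic behind that "million to one," using the round figures above:

```python
# $350 bought 256 KB in 1985; a similar (inflation-adjusted) amount
# buys on the order of 320 GB today. How many times more is that?
then_bytes = 256 * 1024          # 256 KB
now_bytes = 320 * 1024 ** 3      # 320 GB
ratio = now_bytes / then_bytes   # 1,310,720 — about 1.3 million to one
```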

And we'll leave it here for this month. See you next month, with the November issue.

Jeff Minter and the history of video games

Sometimes at work we put together really cool things… and then I completely forget about them. For a while now, Joan Arnedo, among other things director of UOC's online master's degree in Video Game Design and Programming, has been running RUMSXPLORA, a series of events devoted to preserving the memory of 1980s video games. For last year's edition Joan was looking for a star speaker, one rooted in the Commodore world. And while we were discussing it, the name of the legendary Jeff Minter came out of my mouth (if someone releases a Llamasoft: The Jeff Minter Story about you, and your career in the business starts with Centipede for the ZX81, moves through Gridrunner for the Commodore 64 and Attack of the Mutant Camels, continues with things for the Amiga, the Atari machines and the Atari Jaguar, and runs all the way to the present day, you are a legend).

And who knows how, but Joan talked Jeff into coming to Barcelona, and the result is this talk. The talk, moreover, was a fantastic first-person tour of the history and evolution of video games from the very early eighties to today and, as a bonus, it includes a reminder of why it matters to preserve the memory of those little programs from the days when memory was measured in kilobytes.

(I'm sure that at some point, relatively early in the twentieth century, someone proposed a film archive to preserve the earliest movies, and someone else replied that there was no value in preserving something so crude. It's likely that if we had been a bit more far-sighted, we would today have more and better mementos of the beginnings of cinema. I, being full of faith, hope we'll be more careful with the history of video games, but the industry does everything it can to burn its own history (see here and here).)

Anyway, don't miss Jeff Minter's talk, by the gods of the video game Olympus…

(Oh, and if you get bored, there's an episode of a certain podcast about the event that includes ten-odd minutes of someone talking with the legend ☺️.)

Byte, September '85. Ten years of Byte

Cover of the September 1985 Byte. One cover theme is 'homebrewing'; the other is Ciarcia's supersystem, a Z80-compatible computer running at 6 megahertz with 256 kilobytes of RAM.

So here we go with the September '85 issue of Byte, the magazine's tenth-anniversary number… On the cover, a computer, but one built by one of the magazine's star authors, Steve Ciarcia, who pulled out of his sleeve an 8-bit computer for the 16-bit era, with schematics galore so you could build it yourself:

BUILD THE SB180 SINGLE-BOARD COMPUTER

PART 1: THE HARDWARE 

by Steve Ciarcia

This computer reasserts 8-bit computing in a 16-bit world

Newer, faster, better. These words are screamed at you in ads and reviews of virtually every new computer that comes to market. Unfortunately, many of the proponents of this rhetoric are going on hearsay evidence. While advertising hype has its place in our culture, a more thorough investigation may lead you to alternative conclusions.

Generally speaking, quotes of increased performance are basically comparisons of CPU (central processing unit) instruction times, rarely involving the operating system. The 68000 is indeed a more capable processor than the 6502, but that doesn't necessarily mean that commercial application programs always run faster because the CPU has more capability. People owning 128K-byte Macintoshes have discovered this.

The bus size of the processor is only one factor in the performance of a computer system. Operating-system design and programming styles contribute much more to the overall throughput of a computer. It is not enough to simply compare 8 to 16 bits or 16 to 32 bits. For example, the Sieve of Eratosthenes prime-number benchmark runs faster in BASIC on the 8-bit 8052-based controller board presented in last month's Circuit Cellar than it does on a 16-bit IBM PC.
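As a reminder of what that benchmark actually computes, here is a minimal Python sketch of the Sieve of Eratosthenes (my own version of the algorithm, not the benchmark code the magazine used):

```python
def sieve(limit):
    """Sieve of Eratosthenes: return all primes below `limit`.

    Repeatedly strike out the multiples of each number that is
    still marked as prime; whatever survives is prime."""
    is_prime = [True] * limit
    is_prime[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # p's smaller multiples were struck out by smaller primes.
            for multiple in range(p * p, limit, p):
                is_prime[multiple] = False
    return [n for n, flag in enumerate(is_prime) if flag]

primes = sieve(100)   # the 25 primes below 100, from 2 to 97
```

The inner loop is nothing but array indexing and stores, which is why the benchmark measures interpreter and memory throughput far more than raw CPU bus width.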

And in case that weren't enough for the "things you wouldn't see in a general-interest computer magazine today" section…

AN ANALYSIS OF SORTS

by Jonathan Amsterdam

How to choose one sorting algorithm over another

A friend told me recently that 90 percent of all the computer programs in the world sort. I can believe it. Our society's passion for organization has elevated the simple task of putting things in order to a position of major importance. And who better to carry out the job than those informational beasts of burden— computers?

Because of their significance, sorting algorithms have been thoroughly studied. Some are slow and some are fast. Some sort a few items and some sort millions of items. Here I want to discuss sorting in the context of three different algorithms: Selection Sort, for small lists; Quicksort, for larger lists; and Mergesort, for lists of a size so monstrous they can't fit into memory all at once. But first we will need to develop some simple tools to help us with our analysis of these algorithms.

Analysis

Our goal is to understand the efficiency of some sorting algorithms. But we are immediately faced with a problem: How can we study an algorithm in the abstract without considering the language it's written in or the machine it's running on? For example, any algorithm written in a high-level language will run faster when written in assembly language. And any program running on a microcomputer would run faster on a mainframe. We want to abstract away from these facts, to talk about an algorithm's running time independent of machine or language.

Indeed: a discussion brainy enough for a first-year Algorithms course on the various sorting algorithms (a great topic, always well worth reading), complete with little graphs of linear, quadratic, and "n log n" complexities, not-so-basic algorithms…

Figure 1: the rates of growth of n, n log n, and n squared

Listing 1: The algorithm for Selection Sort.

Selection Sort.

Input: an array, A, and its size, n.
Output: the same array A, in sorted order.
begin
  for i := 1 to n do begin
    m := i;
    for j := i + 1 to n do
      compare A[j] to A[m], making j the new m if it is less;
    swap A[i] and A[m];
  end
end.

…Mergesort,

Figure 2: The mergesort tree.

Listing 2: The algorithm for Mergesort.

Mergesort.

Input: a list, L. 
Output: a sorted list, S. 
begin
  If L is one item long, then S = L.
  Otherwise, split L into two lists, L1 and L2, each about half as big.
  Mergesort L1 into S1.
  Mergesort L2 into S2.
  Merge S1 and S2 into S.
end.

…or Quicksort itself:

Listing 3: The algorithm for Quicksort.

Quicksort.

Input: an array A, with items from 1 to n.
Output: the same array, sorted.
begin
  choose a pivot;
  partition the array around an index i, so that all items <= the pivot are at positions below i;
  Quicksort A from 1 to i - 1;
  Quicksort A from i to n;
end.
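The three listings translate almost line for line into a modern language. A quick Python sketch of mine (not the article's code; the Quicksort below uses a simplified, non-in-place partition):

```python
def selection_sort(a):
    """Listing 1: for each position, find the least remaining item and swap it in."""
    n = len(a)
    for i in range(n):
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return a

def merge_sort(lst):
    """Listing 2: split in half, sort each half, merge the sorted halves."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    left, right = merge_sort(lst[:mid]), merge_sort(lst[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

def quicksort(a):
    """Listing 3: partition around a pivot, then sort each side."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

Selection Sort does its roughly n²/2 comparisons regardless of input, which is why the article reserves it for small lists; Mergesort only ever needs the two halves it is merging, which is what makes it suitable for lists too big for memory.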

But the point of this issue was celebrating the magazine's first decade, and the corresponding look back, including interviews with the aforementioned Ciarcia and with Jerry Pournelle (whom we discussed in the July issue).

NO CELEBRATION of BYTE's 10th anniversary would be complete without the acknowledgment of some of the events and contributions that helped to shape the magazine. In the too-few pages that follow, we tried to capture some of the flavor of the past 10 years.

Special thanks to all contributors and to the BYTE staff, especially Gregg Williams, who chaired the project, Richard Shuford, Rich Malloy, Mark Welch, and Stan Wszola.

A Microcomputing Timeline
Notable Quotes
Evolution of the Microprocessor
Interview: Carl Helmers
Interview: Steve Ciarcia
Ciarcia's Prodigious Output
Interview: Robert Tinney
Tinney Favorites
Interview: Jerry Pournelle

It's worth following the links to the magazine that I leave on each image, if only to enjoy the wonders of the industrial design of the second half of the seventies and the first half of the eighties…

Photos of the Sphere 1, KIM-1, and Altair 8800 computers, all from 1975

…such as the Sphere 1, the KIM-1, or the absolutely legendary Altair 8800, for example.

Picking up topics that we sometimes forget go back a long way: keyboards.

The Keyport 717 has very little to envy in the craziest of today's keyboards.

In this month's Kernel we find the launch of Excel at a joint Microsoft and Apple press conference because, as you may not have known, Excel was originally a Mac application.

Kernel

The ongoing construction work at Chaos Manor made it desirable for Jerry to escape yet again. He attended a joint press conference held by Microsoft and Apple in New York. The product introduced at the conference, Excel, is a spreadsheet for the Macintosh. Comments made at the press conference caused Jerry to put down some thoughts on software integration and whether or not we need it. He also looked at several new products, including a new version of BASIC from the inventors of the language.

This being our anniversary issue, Dick Pountain brings us a condensed history of personal computing in Great Britain. He also introduces us to a rugged new lap-held portable, the Husky Hunter.

From Japan, Bill Raike sends us an abbreviated history of that country's microcomputers and also discusses an innovative new product from Brother Industries— the SV-2000 Software Vending System.

In this month's According to Webster, Bruce describes his experiences at the West Coast Computer Faire. He discovered that it isn't as much fun as it used to be, but he found some interesting products on display. He also discusses Apple's plans for the Macintosh, predicts success for the Amiga, and looks forward to testing a host of new products.

Bob Kurosaka discusses the world of transcendental numbers in Mathematical Recreations. Some of them are familiar to us, such as e, the base of natural logarithms, and π. He looks at some hiding places for these two numbers and some ways to approximate their values.
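Those "ways to approximate their values" can be sketched in a few lines of Python. My own examples, not necessarily the ones from the column: the factorial series for e, and Nilakantha's series for π.

```python
import math

def approx_e(terms):
    """e as the sum of 1/k! for k = 0, 1, 2, …; converges very quickly."""
    total, factorial = 0.0, 1
    for k in range(terms):
        if k > 0:
            factorial *= k       # factorial now holds k!
        total += 1.0 / factorial
    return total

def approx_pi(terms):
    """Nilakantha's series: pi = 3 + 4/(2*3*4) - 4/(4*5*6) + 4/(6*7*8) - …"""
    total, sign = 3.0, 1
    for k in range(terms):
        n = 2 * k + 2
        total += sign * 4.0 / (n * (n + 1) * (n + 2))
        sign = -sign
    return total
```

Fifteen terms of the first series already pin e down to ten decimal places; the π series is slower but far faster than the textbook Leibniz series.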

And we couldn't skip, of course, the little piece Pournelle devotes to the Amiga, which he votes the successor to the Apple II for the late eighties. If only, Jerry. If only.

Amiga

Among its other faults, Apple has been shamelessly neglecting the Apple II family, and specifically the Apple IIe. When the IIc came out a year ago, Apple cut the price of the IIe and slowed production, figuring the machine would die of its own accord. Instead, the sales jumped dramatically, easily outselling the IIc. People would see the IIc ads, come into the computer store, and walk out with a IIe. Why? Because the IIe had slots, while the IIc (like the Mac) was a closed machine. The IIe is a chameleon: With the right set of boards, you can make it look like and do just about anything. Case in point: The nicest development system I've ever used, including mainframes and minis, was an Apple IIe with 128K bytes of RAM, an Accelerator IIe card (3.5-megahertz 65C02), and two Axlon 320K-byte RAM disks (configured as four 160K-byte floppy disks). Apple's response to the increased IIe sales was to cut back on production and raise its price (while discounting the IIc). Even so, it wasn't until late 1984 that the IIc finally started outselling the IIe.

What does this have to do with the Amiga? Well, several machines are competing in the low-end market: the Atari 520ST, the Apple IIe, the Mac (to a lesser extent), and the Amiga. Guess how many of these are easily expandable? Just one: the Amiga. Guess which machine will probably end up being the Apple II of the late eighties? I don't think the IIe will, nor the Mac, and the ST is a tightly closed, nonexpandable box. My vote is for the Amiga. From what I can see, the Amiga's graphics, sound, 68000 processor, memory map (allowing up to 8 megabytes of RAM), and expansion bus give it the potential of a long and successful life. There's always the chance that Apple will, indeed, come out with a souped-up Apple II next year, but even with the Western Design Center chips (65816, etc.) and the nifty 3½-inch Duodisk (1.6 megabytes of storage), it will probably be too little, too late.
(I've linked the image to its large version, not to the source in the Archive (though you always have the option of reading the image's alt text).)

And I'll close with two more pieces. The first, standard fare in today's magazines (not): the typical article devoted to the numbers π and e…

pi, e, and All That

Sneaking up on transcendental numbers

by Robert T. Kurosaka

God made the integers, all else is the work of man.
— Kronecker

This famous quote of Leopold Kronecker serves as the starting point for this month's column. The integers (the whole numbers) can be used to construct other numbers.

We can construct rational numbers by dividing one integer by another. When we do so, we get either a terminating decimal (1/4=0.25) or a nonterminating, repeating decimal (7/18 = 0.388888 ...). Repeating, or cyclic, decimals are a fascinating study I may explore in a future column.
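The terminating-versus-repeating dichotomy is easy to demonstrate mechanically: do the long division and watch for a repeated remainder. A sketch of mine, not from the column:

```python
def decimal_expansion(num, den, max_digits=30):
    """Long division of num/den, detecting the repeating cycle.

    Returns (integer part, non-repeating digits, repeating digits);
    the repeating part is '' for terminating decimals."""
    integer, remainder = divmod(num, den)
    digits, seen = [], {}
    while remainder and remainder not in seen and len(digits) < max_digits:
        seen[remainder] = len(digits)   # remember where this remainder appeared
        remainder *= 10
        d, remainder = divmod(remainder, den)
        digits.append(str(d))
    if remainder and remainder in seen:
        start = seen[remainder]         # the cycle begins where it first appeared
        return integer, "".join(digits[:start]), "".join(digits[start:])
    return integer, "".join(digits), ""

# 1/4 terminates as 0.25; 7/18 repeats as 0.3(8); 1/3 repeats as 0.(3).
```

A remainder of zero means the decimal terminates; a repeated remainder means the digits must cycle from that point on, which is exactly why every integer fraction falls into one of the two classes the column describes.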

Irrational Numbers

We can also construct numbers that are both nonterminating and nonrepeating. It is a rather amazing notion that a string of digits may go on forever without having to establish a pattern. It's such an odd notion that the ancient Greeks originally did not believe it possible— or even imaginable. When it was established that the square root of 2 was such a number, the Greeks called this kind of number irrational. The root meaning of irrational is "without ratio," or unable to be expressed as a fraction. The Greeks found such numbers irrational not only in the sense of "non-ratio-able" but also in the sense of "nonsensical."

The differences between rational and irrational numbers are substantial. It can be shown that no more rational numbers exist than do whole numbers, but irrational numbers outnumber rational numbers. This fact, which is often presented as a paradox, is not especially surprising when you look at how we have constructed rational numbers. They are built up out of whole numbers and can be expressed as integer fractions. As I said above, irrational numbers cannot be so expressed.

Two Types

There are two different kinds of irrational numbers. The first, like the square root of...

…and the second devoted to the VersaBraille II, a computer designed to work in braille, because concern for accessibility isn't new either:

VersaBraille II

Telesensory Systems has introduced the VersaBraille II system, a portable, disk-based electronic information processor for the blind. This braille computer lets you electronically store, process, and retrieve information. A special telephone modem can link VersaBraille II to other computers.

VersaBraille II consists of a standard 3½-inch microfloppy-disk system and a braille display that substitutes for a video monitor. Its memory holds up to 30,000 characters; disk support boosts the unit's capacity to 77,000 characters. This is adequate for many word-processing procedures, such as formatting, high-speed searching, and inserting, deleting, and relocating text. The system can simultaneously output braille and print information.

VersaBraille II is fully programmable. Menus guide the user to each of the system's programs. The manufacturer provides special software that converts VersaBraille II into a four-function calculator with algebraic logic, floating decimal point, square root, and percent. Plans for other software packages include a 50,000-word spelling checker, a two-way braille translator, and a language interpreter.

The price of a VersaBraille II system is $6995 plus shipping and handling.

I've found little information about the VersaBraille II, but if anyone wants to dig into its predecessor, the original VersaBraille, here's a document to start with.

Righto. We'll be back next month with the November issue. By the way, if anyone wants to do the homework on their own, besides the magazine's archive on the Archive, you also have this gem of a browser that my little brother sent me a few weeks ago.

See you next time!

Chat Control, the umpteenth attempt

Here I collect my notes from reading Chat Control Must Be Stopped, Act Now! If you read even a little English, better to go to the source than settle for my summary. And if you want to take the trouble to contact your EU representatives, here's a tool.

What Chat Control is

A proposed piece of European legislation that puts everyone's privacy at risk in the name of a supposed protection of children's rights (something that naturally concerns us all) and that will turn out, at best, to be very ineffective. The matter is especially pressing right now, because this Friday (September 12) the process "moves to the next phase," leading to a vote on October 14 (and, at least for now, Spain is in favor of approving Chat Control).

Chat Control would have every provider of messaging (WhatsApp, Telegram and company), email, social networks, hosting, and a long etcetera monitor all our communications and files in search of child sexual abuse material. If Chat Control goes ahead, we'll do away with the protection of our communications and take a step forward in mass state surveillance, putting democratic guarantees at risk and brushing aside regulations like the famous GDPR.

Why Chat Control wouldn't work to protect children's rights

As always, better to go to the sources than read my summary: here is the Joint statement on the future of the CSA Regulation from EDRi, a human rights organization, together with many other organizations devoted to digital rights, human rights and, above all, child protection and children's rights. They argue that many other initiatives would be far more effective: investing in social work, providing more and better help to victims, offering contact and support hotlines, working on prevention and education, giving more resources to law enforcement, and working on ICT security. They also point out that reports show the error rate of the monitoring tools to be deployed is far higher than one would expect; that, among other problems, we would swamp existing services with tons of false positives; and that we would build enormous databases of extremely sensitive information, a juicy target for every cybercriminal hunting for people to extort (not to mention the misuse insiders at the organizations administering those databases could make of them, of course).

(Let me insist once more: far better to read the sources than my summary.)

What can I do

Make noise, basically. Fortunately, it wouldn't be the first time noise stops a proposal. As I said above, this tool helps you draft a message to send to your political representatives in the EU. Spreading the word also helps. Probably only a little, but it helps. And then, keeping these things in mind when you vote might also be an idea…

Anyway. Fingers crossed.

Readings (2025.II)

My summer is running out (the astronomical one; the meteorological one is already gone), which means my reading pace is going to slow down (even more, yes), so here's what we've gotten through since last time.

You start reading the second (and, sadly, last) book in Douglas Adams's Dirk Gently series; after a few pages you realize you've already read it, but Adams writes so well that you read it again, because sometimes you need a good laugh. Extremely recommendable. As is the first season of the Netflix series of the same name, if you haven't seen it (but flatly refuse to watch the second).

Last year I read (see) The Maniac, Benjamín Labatut's second book, and loved it, and now I've taken up his first, Un verdor terrible… and it gives me pause. It's as easy and attractive a read as the other, and it follows basically the same scheme: dramatizations of real events illustrating the author's fascination with scientific genius of the most dramatic and pathological kind. If The Maniac was built on the perverse genius of John von Neumann, here the main core is the story of Erwin Schrödinger (I link to the English version of his Wikipedia article because the Spanish one skips notable, and horrible, parts of his story). And the thing is, I'm not sure Labatut's books don't end up romanticizing that mad-genius figure and excusing his perversions beyond what would be advisable :-S. In any case, that Labatut writes like the angels (sometimes fallen ones) is beyond any doubt.

Juan José Millás is a master, that's a fact. This little book (barely 112 pages, enough to win the 1990 Nadal Prize) starts off perhaps "not very Millás," but shortly after the narration of the protagonist's alienation gets going, things reach cruising altitude and speed and you can't put the book down until the end. Five stars, or however many it takes.

We continue the healthy habit of reading popular linguistics books, this one focused on the effects of "the internets" on language (perhaps chief among them: giving linguists plenty to work with). The author is, by the way, one of the co-creators of the Lingthusiasm podcast. I must confess that, while the book is very well written and quite recommendable, it didn't interest me all that much overall, probably because it's talking "about me" and the tune sounded too familiar. Even so, I'm now very curious to read the Spanish translation-adaptation of the book, Arroba lengua, because adapting a book like this must have been a colossal effort.

And we close with a graphic novel (or rather, a long-format comic). Zerocalcare, I confess, I discovered back when I had Netflix, because the Simpsons yellow of his characters in the series Cortar por la línea de puntos caught my eye (I see that in 2023 they released another series, Este mundo no me hará mala persona). A happy accident, because he's an extraordinary storyteller, and few people explain as well as he does the passage from childhood to adolescence to supposed adulthood (in the masculine, since first person is his thing, but I think his view holds up quite reasonably from a gender standpoint, even without escaping that male point of view). Highly recommended.

Anyway. At the rate we're going, I'm not sure I'll manage another "readings" post before the year is out. We'll try.