Why the Apple II Didn’t Support Lowercase Letters (vintagecomputing.com)
170 points by freediver on Sept 10, 2020 | 139 comments


Woz explains why the original Apple II (1977) doesn't support lowercase letters. That's not surprising, in retrospect; neither of its major contemporaries, the Commodore PET 2001 and the TRS-80 Model I, does either, despite both being developed by major corporations with substantial resources.

He doesn't explain why the Apple II+ (1979)—after the II's market success was proven—doesn't support lowercase letters. Even if software uses graphics mode to display lowercase letters, the II and II+'s keyboard does not have physical/electrical support for detecting shifted letters. Since graphics mode is cumbersome and slow, word processors for the II and II+ typically use reverse video to indicate capital letters, and use another key like Escape as a shift toggle. A popular alternative is the shift-key mod that fattire mentioned, which requires soldering of a wire to one of the paddle ports.

The lack of support is because the company was working on the Apple III (1980), which it expected would quickly obsolete the II series. The III has built-in 80-column text and full lowercase support, at both the character-font and physical-keyboard levels. Apple had incentive to not make the II too attractive.

Neither Woz nor anyone else at Apple expected that a) the III would quickly fail, and b) the II series would remain Apple's bread and butter. Without the III's distraction, the II+ would surely have had built-in lowercase software and hardware support, or another II with such support would have appeared around 1981. As it was, the III took up so much of Apple's resources that the Apple IIe did not appear until 1983, by which time the IBM PC had surpassed the II series.


I've always wanted to know why the Apple II video screen is only 280 pixels wide.

The Apple II video screen is 40 bytes wide, but it only draws 7 pixels per byte, for a total of 280 pixels. The 8th bit was used for selecting the color palette (eventually; initially the 8th bit was just ignored).

Drawing 7 pixels per byte makes everything a lot more complicated, so it would seem to go against Woz's obsession with simplicity. It's hard to believe that 7 pixels per byte would reduce the chip count, and it certainly makes the software a lot more complicated (lots of divide-by-sevens).
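
To make the divide-by-sevens concrete, here is a rough sketch (Python, purely illustrative) of the horizontal math a hi-res plot routine has to do: byte column = x / 7, bit within byte = x mod 7, with bit 0 displayed leftmost and bit 7 reserved for the palette:

  # Horizontal pixel math on the 280-wide hi-res screen (illustrative sketch).
  # 40 bytes per line, 7 displayed pixels per byte, bit 0 leftmost,
  # bit 7 reserved for the palette/half-pixel shift.
  def hires_x(x):
      byte_col, bit = divmod(x, 7)   # the divide-by-seven in question
      return byte_col, 1 << bit      # (byte column 0-39, bit mask)

  print(hires_x(200))                # (28, 16): byte column 28, bit 4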

CGA graphics and the Atari 400/800 also provide a composite graphics mode similar to the Apple II, where each pixel lines up with the NTSC color burst frequency, but they draw 8 pixels per byte for a total of 320 pixels, and therefore use a wider portion of the screen.

Perhaps Woz thought (incorrectly) that the NTSC screen wasn't wide enough to accommodate 40x8 (320) pixels so he cut it down to 40x7, or he planned all along to use the 8th bit to provide extra color. It would be interesting to know the actual reason.


For text, a 5x7 dot matrix font fits nicely in 7 (wide) x 8 (high). 8-pixel-wide characters look funny: glyphs have to be an odd number of pixels wide to get symmetrical letters like A, so you end up with either 1 or 3 pixels of spacing between them.

While NTSC allows 320 pixels, typical analog TVs overscanned a lot, and you'd miss the leftmost and rightmost column or two of text.


Yup. In the early days, one of the first things I did to the TVs I set up for myself and others as monitors was a full CRT setup to put the entire safe area on screen: convergence, color drive, etc.

The Apple always fit. Atari machines often didn't.

On my personal "monitor", I scored a finer dot pitch CRT and did a lot of work to put the entire frame on screen.

Atari did allow wide screen DMA which used up nearly the entire active display area.

I would run a 384x224 display on it sometimes.

And after a full alignment and a better CRT, one could read 80 columns on the set, and color programs looked very good.


I am gonna ramble a bit because I really like the era and CRT displays.

... a lot of the picture quality advantage the pro-grade CRT TV displays had was due to finer-pitch CRTs.

The circuits matter too. Don't get me wrong.

Even without a spiffy fine-pitch CRT, a well-calibrated '80s-era TV (some '70s-era ones too) worked well as a monitor.

I had worked at a TV repair shop when I was a kid. The Atari machine I had was just good enough to generate respectable alignment and calibration signals. If you were willing to put in a good day's work, many sets could perform far better than they did just sitting there showing slowly degrading TV programs.

But, they tended to age and degrade a lot more quickly than we are used to today. I did my special one three times over the course of 5 years or so to keep it at peak.

That's another difference in the pro grade gear. It stays on point for ages.

With Apple setups, one could deliberately calibrate to maximize the essentially monochrome screen, and expand it to take up a larger area to get a little bit more benefit from the often coarse shadow mask on color sets.

Many people would just run both. Monochrome for text, some color TV for graphics.

A workaround for color sets was to run saturation a bit low and make sure the color killer circuit was hair trigger so it worked when the Apple requested no color.

Moderate contrast, bringing up the screen levels while at the same time lowering the drivers would sharpen things up considerably. End result was a decent resolution and that "don't use it in a brightly lit room" tradeoff.

Bonus points for tapping into composite and avoiding RF. Where I lived, a lot of people had a UHF modulator on their Apple computers. These worked on something like channel 34 instead of the more interference-prone channels 2, 3, or 4.

The real fun began when VCRs hit the scene. Hit record, set it for SLP recording, and you got an up-to-8-hour record of what you did.

Or, run the VCR faster and get a higher quality record for show and tell or debugging.

If you crave a CRT today, obviously the pro grade displays on their way out are great. You will likely be impressed when watching something off DVD, or Bluray. Standard definition was much better than most of us saw due to the tradeoffs made in mass market displays.

But those are harder and harder to get.

The next best thing is a newer CRT. Clean and recap that if needed and then go through the alignment and calibration using one of the many DVD images available for that purpose.

Many of those have great filters that can do a lot for Apple 2 like signals.

Then enjoy the artifact art in all its glory!

http://www.appleoldies.ca/bmp2dhr/disclaimer/

The images on this page use the better "Double High Res" artifact color possible on the Apple //e

There are some examples of standard Apple II hi-res, the original graphics being discussed here, on that page.

Many artists used patterns to exploit secondary artifact effects (best seen on an NTSC TV with the color saturation turned up a bit) and our own color perception to produce images that appear more colorful than one would expect the basic 6-color design to deliver.

This pixel art can be seen all throughout the 8 bit era running from the late 70's to mid to maybe late 90's.

https://images.app.goo.gl/1wzZhp76tsC3hC6G6

In that one, only one color attribute set was used: black, white, orange and blue. The secondary artifacts show nicely here.

A single orange pixel by itself on black delivers a slightly different overall color than the same pixel nestled among all-white pixels.

Dithering coupled with these subtle effects could really change the overall viewer perception!

In addition, due to how the NTSC color worked, intensity and color got coupled together, giving pixels a sort of texture and subtle position difference when viewed on a color display. On the Apple, this can be seen as a smudge of color around the pixel. You can see this in the lower resolution graphics where the pixels literally don't quite line up.

Many more and better can be seen on the Total Replay image being developed right now.

Emulators have modeled what a real display would do very well. You get a whole lot of the experience that way.


That seems like a big price to pay for slightly better text spacing. Virtually every other home computer used 6 or 8 pixels per char/byte.

I don't think overscan would be an issue. The TMS9918 is also (the equivalent of) 320 pixels wide, and it was used in many different home computers and game consoles intended to connect to the family TV. There's no overscan issue, unless it's connected to an Apple II monitor.

When the TMS9918 is connected to an Apple II monitor there is a huge overscan issue, since the monitor is tuned for the narrower Apple II screen. It looks like this: https://imgur.com/rGcRpw0

That's what got me thinking about this issue in the first place.


OK, but the Apple ][ was sold before special computer monitors were common. It was meant to hook up to your existing home TV. So what mattered at the time was the way most people's TVs were calibrated. They usually had a lot of overscan, because (a) 1970s picture tubes weren't square, they had a lot of rounding at the corners (b) the (all-analog) deflection & blanking circuits made fairly ugly artifacts at the edges which TV makers tried to hide.

If they'd shipped with 320-wide video, a lot of people wouldn't have seen the prompt on the left hand side.


The Atari 400/800 and the TMS9918 (TI 99/4A, Colecovision, etc.) are 320 pixels wide and were designed to run on those same 1970s home television sets, and they didn't have an overscan problem (maybe some sets did, but most didn't). I don't think the Apple II had to cut the screen width down to 280 pixels.

That's the heart of the question. Did Woz think that 320 pixels would not fit, or did he cut the width down for another reason, like more color?


Or, perhaps, he didn't want to pay for a video generator IC like a 9918 or 6845.

If you look at the video generation circuit on an Apple ][ it's all done in real-time in TTL, alongside the DRAM refresh activity. And it's all crammed into the normally unused tock of the 6502, making this work with virtually zero overhead on the CPU. That's the reason why the memory mapping of the screen isn't linear to the physical display. The software tradeoff was cheaper than the hardware one.

http://twimgs.com/informationweek/byte/archive/Apple-II-Desc...


And it made add on cards simpler, as well as cycle counting. In addition, the overall throughput of the CPU is better than most other machines, giving the Apple an effectively higher speed at that clock rate.

Overall, given memory expansion, the goofy screen didn't end up being too big of a deal. Most programmers made Y-axis lookup tables and called it a day. The fact that the artifact color mapping repeated every word (two bytes) did have an impact, though: it was generally faster to maintain two pre-shifted copies of software sprites.
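
For anyone curious what those lookup tables encode, here is a sketch of the interleaved row addressing (Python, just to show the arithmetic; $2000 is hi-res page 1):

  # Apple II hi-res row interleave, page 1 at $2000 (sketch of the arithmetic).
  # Scanlines are scattered through memory, which is why everyone precomputed a table.
  HGR1 = 0x2000

  def hires_row_addr(y):
      # base address of scanline y (0-191)
      return HGR1 + (y % 8) * 0x400 + ((y // 8) % 8) * 0x80 + (y // 64) * 0x28

  Y_TABLE = [hires_row_addr(y) for y in range(192)]   # the classic Y-axis lookup table
  print([hex(Y_TABLE[y]) for y in (0, 1, 8, 64)])     # ['0x2000', '0x2400', '0x2080', '0x2028']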


Speaking of add on cards, I own a TMS9918 video card for my Apple II. It's remarkably simple. The 9918 manages its own 16 kB of VRAM and it doesn't place any timing restriction on the host system.

You'd think that the 9918 would be a bottle neck between the CPU and the VRAM, but the Apple II can write to the 9918 VRAM as fast as its own internal RAM. Faster, actually, since the destination address auto-increments. That was a surprise.
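
For anyone who hasn't programmed one: the 9918 exposes just two ports, a control port and a data port. A VRAM write is two control writes to latch the 14-bit address (bit 6 of the second byte means "write"), then a stream of data writes that auto-increment the address. Here's a toy Python model of that flow, just to illustrate the protocol (not driver code):

  # Toy model of the TMS9918 VRAM write protocol (illustration, not hardware code).
  class TMS9918Model:
      def __init__(self):
          self.vram = bytearray(16 * 1024)
          self.addr = 0
          self.latch = None

      def write_control(self, value):
          if self.latch is None:
              self.latch = value                              # first write: address low byte
          else:
              self.addr = ((value & 0x3F) << 8) | self.latch  # second write: high bits, bit 6 = write mode
              self.latch = None

      def write_data(self, value):
          self.vram[self.addr] = value
          self.addr = (self.addr + 1) & 0x3FFF                # address auto-increments

  vdp = TMS9918Model()
  vdp.write_control(0x00)          # VRAM address $0000, low byte
  vdp.write_control(0x40)          # high byte 0, plus the write flag
  for b in b"HELLO":
      vdp.write_data(b)            # consecutive bytes, no address writes in between
  print(bytes(vdp.vram[:5]))       # b'HELLO'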


Interesting!

Hmmm, is the bitmap linear by line, or C64 style?

Auto increment + C64 style would rock pretty hard. It's still good per line.

Both have their merits.


Bitmap mode is made up of 8x8 tiles. The screen is 32x24 tiles, or 256x192 pixels. 2 colors per line per tile. So a single tile can contain up to 16 colors.

A single screen is divided into 3 separate tile maps, each 256 bytes long.

It also supports 32 sprites, but only 4 sprites per scanline, which is its biggest limitation.

The Sega Master System used an upgraded version of the 9918, adding 64 colors, smooth scrolling, and 8 sprites per scanline, all within the same 16 kB of VRAM.
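
To make that layout concrete, here is a sketch of pixel addressing in the bitmap (Graphics II) mode, assuming the usual setup where each 256-entry third of the name table simply counts 0-255 so the pattern table behaves like a bitmap (Python; addresses are VRAM offsets from the pattern table base):

  # Graphics II bitmap addressing on the 9918 (sketch; assumes the usual setup
  # where each third of the name table holds 0..255 in order).
  # Pattern table = 6 KB, split into three 2 KB banks, one per screen third;
  # each 8x8 tile is 8 bytes, one per pixel row, bit 7 = leftmost pixel.
  def bitmap_byte(x, y, pattern_base=0x0000):
      third = y // 64                        # which 2 KB bank
      tile = ((y % 64) // 8) * 32 + x // 8   # tile index within that third
      addr = pattern_base + third * 0x800 + tile * 8 + (y % 8)
      return addr, 0x80 >> (x % 8)           # (VRAM address, bit mask)

  print(bitmap_byte(0, 0))       # (0, 128)
  print(bitmap_byte(255, 191))   # (6143, 1) -- last byte of the 6 KB pattern table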


So if a person maps them up, one gets basically a C64-style display.

Yeah, auto-increment can rock! Blasting characters to memory would be fast and only requires a precise index for the source data. Perfect for the 6502. The other index can be updated every so often, depending on what is being drawn.


Right, but the Apple II video generator is already reading 40 bytes per scan line, and is using all 8 bits of each byte to generate the video signal. I would think that drawing 8 pixels per byte instead of 7 would have been the simpler thing to implement.


Actually, it's not.

In the Apple 1, only 7 bits were output. Expanding that circuit to do graphics would take a byte and still output just 7 bits, simple.

People are pretty sure the Apple 1 video circuit was just expanded to do graphics, which is why it's 7, and that left the high bit for a color shift.


Err, as an Atari 400/800 owner, I can assure you that they did in fact have an overscan problem. Atari BASIC even defaulted to not printing to the first 2 characters and last 2 characters of each line.

Here's an example Atari BASIC program that uses POKE statements to change and then restore the default margins.

https://www.atariarchives.org/c2bag/page003.php


Those margins make the active text region almost exactly match the Apple's display area.

Interesting! I never noted that before.


Tangent: when we bought our Apple II in 1980, the salesperson explained that the "Sup'R'Mod" extra video box we also needed to connect to a TV was sold separately, manufactured by a different company, so that neither needed FCC certification.

I don't know whether this story is true, but it's what we were told.


Wikipedia has a page on the Sup'R'Mod and cites this page for the same story with some additional detail: https://apple2history.org/history/ah03/ (see "Other Design Features" section)

That's a funny little piece of computer history.


That's also how it was explained to me, back in the days.


Actually, the TMS9918 has a 240x192 text mode with 40x24 characters of 6x8 each and several 256x192 graphics modes with 32x24 characters of 8x8 each. This is always fewer than Apple II's 280 pixels per line.


Right, but in text mode the 9918 draws 6 native pixels for every 8 NTSC color "pixels". So, measured in NTSC color pixels, it's 320 wide (40x8) compared to 280 (40x7) for the Apple II.

This image (white text on a blue background) shows how the 9918's native pixels line up with the NTSC color burst frequency: https://imgur.com/rGcRpw0


And a 7x7 dot matrix font fits nicely into 8 (wide) x 8 (high)-- this is what the original IBM PC did. So it doesn't really make a ton of sense to go out of your way to make characters seven pixels wide unless you're anticipating that you're going to use that eighth bit for something else later (or you're really determined to have two pixels between each character rather than one for some reason, I suppose).


I am inclined to think he was thinking about color.

The phase shift makes a big difference.

Having 6 colors and the artifact art variations was enough to do pretty much anything, not always well.

Had it been 4 color, it all would have been far more limited.

Frankly, the screen being good enough may be part of why graphics adapters never became common.

I always thought they should have.

Once accelerators took the 6502/816 to 4Mhz plus, a graphics adapter, perhaps combined with a faster CPU would have made a ton of sense.

The power of the default was stronger due to that 6 color capability, IMHO.

Compare 4 color CGA and it is notable.

That said, CGA over NTSC was Apple like with 16 colors, no attribute clashing. That is what an expansion card for the Apple 8 bitters should have done.


Don't forget that the Apple I was basically a glorified video terminal, so the text mode came first. Also in the Apple II. That's the reason there are 7 pixels horizontally per character.

The graphics mode is just a juiced up text mode, so the 7 pixels were kept.

And for the video shift register, it makes no difference whether the input comes from the character ROM or video RAM.


That's another thing I've wondered about. Did the Apple I also output 7 pixels per character? Since it doesn't do color and doesn't care about the color burst frequency, it's free to use any size of pixel it wants. With a 5x7 font, I would have guessed that 6 pixels per char would be simpler.


The Apple II graphics resolution modes could be mixed with text. So that is to say, in either the lo-res or hi-res modes, a hardware register setting would conceal a bottom portion of the graphics frame buffer, revealing four lines of the text mode.

I think all the circuitry for shifting out the pixels must have been very minimal and shared between the text and hi-res mode.

So, .. if you recall, the character cells in the 40 column text mode are all also 7 pixels wide. The character cells were 7x8. In that space, a certain font fit exactly, with a two pixel space between characters. Here is A:

   OOO   |
  O   O  |
  O   O  |
  OOOOO  |
  O   O  |
  O   O  |
  O   O  |
         |
  -------+
In high res graphics mode, you could create a bold font, by using the extra column:

   OOOO  |
  OO  OO |
  OO  OO |
  OOOOOO |
  OO  OO |
  OO  OO |
  OO  OO |
         |
  -------+
That was seen in a lot of games.

Anyway, the high res graphics mode pixels coincided exactly with the text mode pixels. You could imitate the text mode using high res graphics, and it would look indistinguishable. (Except for not being able to make the characters blink simultaneously, though even that could be sort of faked with page flipping.)

I think that in the 8x8 character cell that would result from a 320x192 mode (40x24 text), it might have been more awkward to design the font. You want an odd number of pixels for symmetry, so characters would probably be 7 wide; 5 would leave 3 pixels of space, which might be too much. And for designing 7-pixel-wide glyphs, maybe 7 scan lines is not enough height.

It's possible that part of the decision was around the design of the text mode and its font, and the high res graphics was tied to that, pixel for pixel.
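
As a rough illustration of that equivalence: imitating a text cell in hi-res is just eight byte-stores per character, one per scanline of the 7x8 cell, at the interleaved row addresses. A Python sketch over a buffer standing in for the hi-res page (the glyph bytes here are made up, with bit 0 as the leftmost pixel):

  # Sketch: drawing one 7x8 "text" cell straight into a hi-res page buffer.
  # Text column N maps to hi-res byte column N, so a character is 8 byte-stores.
  HIRES = bytearray(0x2000)                 # stand-in for the $2000-$3FFF page

  def row_offset(y):                        # interleaved row address, relative to page start
      return (y % 8) * 0x400 + ((y // 8) % 8) * 0x80 + (y // 64) * 0x28

  GLYPH_A = [0b0001110, 0b0010001, 0b0010001, 0b0011111,   # made-up 5x7 'A' in a
             0b0010001, 0b0010001, 0b0010001, 0b0000000]   # 7-wide cell, bit 0 = leftmost

  def draw_char(col, row, glyph):           # col 0-39, text row 0-23
      for line, bits in enumerate(glyph):
          HIRES[row_offset(row * 8 + line) + col] = bits

  draw_char(0, 0, GLYPH_A)                  # same pixels the text mode would show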


The 8th bit was never ignored, it was for phase, and for example determined if you saw green or orange. Probably the best ref for this is from '83, Gayler's "The Apple II Circuit Description."

edit: corrected below, rev 0 non RFI (without the aux vid pin) ignored DL7


Actually, the very first Apple IIs only supported 4 colors in hi-res mode. The 8th bit was ignored.

A later revision added support for 6 colors, using the 8th bit to shift the pixels and change the palette.


Thanks, I just consulted Appendix B of my copy; you are correct. DL7 is unconnected in the non-RFI rev 0, though every later Apple II, RFI or not, could do 6-color HIRES.


There is an upside to that madness, and that is the Apple had a 6 color high res graphics screen, and when only one pixel is used, it actually has a close to square ratio.

Most 2bpp displays of that era made wider than tall pixels.

The result, given creative use of patterns was a high res display that could pretty much do anything. Maybe not that well, depending, but 6 colors per line makes differentiating objects clear.

4 is not quite enough.

I would like to know as well.


> The 8th bit was used for selecting the color pallet (eventually; initially the 8th bit was just ignored)

The palette selection was achieved by...

(Yes, it's that cool)

Shifting the 7-pixel group half-pixel to the left (IIRC, could be to the right). It was a trick used in HP terminals (Woz worked for HP) and printers to have nicer fonts without doubling the pixel clock and character ROM sizes.

For color, this put the pixels at offset positions in relation to the chroma wheel and that caused the switch from green/purple to orange/blue.


Woz loved these kinds of optimizations.

With the Apple 1 he also used 7 bits per character, but he stored them with 7 1024-bit shift registers.


I've read that the 7-pixel-per-byte horizontal layout was a special Woz hack to save on cost. In 'fully'-addressable mode the nearby pixels 'swim about' as pixels are set/cleared in the vicinity.


Some of the Apple clones, such as the Franklin ACE models and others, offered lowercase.

Do a quick search for "apple II shift key mod" and you'll find vintage PDF instructions for adding it yourself..

https://archive.org/stream/II_II-Shift-Key_Modification/II_I...


As a math grad student in 1980, I took out a student loan for $3,000 (equal to my annual stipend) to buy an Apple II computer. (Eventually, computation established my career, just not on this machine.)

I recall making a "shift key mod" within days, that no doubt voided my warranty. It wasn't the mod described in this article. I recall some card that gave me 80 columns, Pascal, and increased my memory from 48K to 64K. I believe that my shift key mod involved cutting a single trace? In any case it worked. Various out-of-school friends learned computers on this machine, and changed careers. I learned the low memory locations like the back of my hand, and wrote crude Pascal programs, while my "real" code was in the relatively young C language on a Unix timesharing machine. That all changed when I bought one of the first Mac 128K's. Manx Aztec C!


I did that on my German Apple ][ Europlus and also got a re-burned EEProm to display the lower case. I like the original Woz story but it seems strange that the later models didn't enable lowercase.


Yeah the Franklin Ace was mostly an Apple ][+ clone but had lower case (but no 80 columns). It's kind of a miracle so much software worked with that.


Videx lower case chip to the rescue:

http://mirrors.apple2.org.za/ftp.apple.asimov.net/documentat...

Another amazing bit of kit by Videx was a replacement keyboard controller called the Enhancer II:

https://archive.org/details/Videx_Enhancer_II_Installation_a...

The Enhancer II included the lower case chip.

Videx also made an 80 column peripheral card for the Apple II. The company is still around:

https://videx.com/


Didn't the Apple II feature lowercase by the time of the IIe? My Brazilian clone of the II+ included the lowercase mod stock (even including Brazilian accented characters such as ç and ã).


It did. The upgrade for the Apple IIe was the 80-column card, which also brought memory to 128K (from 64K). That was built in to the Apple IIc.


> In the early 1970s, I was very poor, living paycheck to paycheck.

I'm just curious, was an engineering job at HP in the 1970s not well paid?


Define "well paid". Entry-level (relative) wealth is a recent thing.


Remember, he didn't have a degree until 1987. And credentials mattered more back then.

When I graduated with a B.S. in 1982 I had 4 offers, ranging from $22K to $28K. It was enough for the two of us -- our apt was $275/month. But there wasn't a lot left over.


Reading Woz's account of hand-assembling his 6502 code reminded me of my own practice, which was the same. I would write out programs on graph paper, with columns for the physical address, any labels, and space for what the code would assemble to, all to the left of my actual assembler code. Then I would drop down to the monitor to enter the hex code I had generated by hand. When you're working under constraints like this, you tend to be very careful in your coding. There's a part of me that still thinks in terms of the Apple ][ architecture when thinking about how a program works.


Same here.

Early on, I didn't have a machine of my own and would hand assemble programs to be typed in next time I could get some machine time.


I found one of my notes where I wrote code for manual assembly. It was the only way, because I didn't have an assembler. https://www.quora.com/How-did-you-learn-an-assembly-language...


  FD7E: C9 E0     CAPTST   CMP   #$E0
  FD80: 90 02              BCC   ADDINP     ;CONVERT TO CAPS
  FD82: 29 DF              AND   #$DF
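
(That AND #$DF is the classic case-folding trick: lowercase differs from uppercase only in bit 5, and it works the same on the Apple's high-bit-set character codes. A quick check in Python:)

  # ASCII lowercase differs from uppercase only in bit 5 ($20), so AND #$DF folds to caps.
  print(hex(ord('a') & 0xDF), chr(ord('a') & 0xDF))   # 0x41 A
  # Same with the Apple's high-bit-set codes: $E1 ('a') -> $C1 ('A').
  print(hex(0xE1 & 0xDF))                             # 0xc1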


Did the Woz read the TV Typewriter article?

https://en.wikipedia.org/wiki/TV_Typewriter

I bought Don Lancaster's book from Radio Shack long ago...

Also: what new technology are people desperately trying to get access to? I need to start the new Apple..


I asked Woz about the TV Typewriter before. In an email to me once, he said he had not heard about it at the time.

https://twitter.com/benjedwards/status/1303784698662055936?s...


When I was a kid, I used to buy EEPROMs at the local electronics store and burn them with the Apple ][ lower-case characters. I sold them in my dad's computer store, along with a wire that would connect the shift key to the last paddle button on the game port. It was a common hack, and some software would actually recognize it and properly display upper/lower characters.


I don't know why, but the interest of the computer community shifted from uppercase letters to lowercase letters over time.


Lowercase is simply easier to read because of more diverse letter and word shapes.

https://ux.stackexchange.com/questions/72622/how-easy-to-rea...


This is not correct. The “word shape model” itself is actually probably wrong[0].

> The weakest evidence in support of word shape is that lowercase text is read faster than uppercase text. This is entirely a practice effect. Most readers spend the bulk of their time reading lowercase text and are therefore more proficient at it. When readers are forced to read large quantities of uppercase text, their reading speed will eventually increase to the rate of lowercase text. Even text oriented as if you were seeing it in a mirror will quickly increase in reading speed with practice (Kolers & Perkins, 1975).

[0] https://docs.microsoft.com/en-us/typography/develop/word-rec...


Jazz pianist here. I had a weekly gig for a couple of years where the singer would choose songs and put the chord chart – | Eb C-7 | F-7 Bb7 | etc. – on the grand piano so the bass player could read it, and it was upside down for me. After a while, it almost seemed easier for me to read chord charts upside down!


Even if this is true and it's just practice, any new programmer still comes with years of practice in reading mostly lowercased text and very little experience reading all-uppercase text. In a field where everything gets reinvented all the time this should have a big impact.


I would also say it's easier to read since, for an average sentence (about 20 words of 4.7 characters each), about 99% of the characters that our eyes see, and we read, are lowercase.

edit: as in, perceiving that 1% case will require more effort.


It might be true to some extent, but I don't think it's a complete deal breaker either. Look at Cyrillic print fonts, for instance; the lowercase is effectively what we would call smallcaps in the Latin alphabet: здравствуйте ЗДРАВСТВУЙТЕ. The р and у do dip lower in lowercase, but that's about it.

I really think it's a matter of habit and tradition more than anything else.


I can recall when using Mixed case in your Hollerith Constants was considered a bit flash :-)

Hollerith constants were the way you did text-to-terminal I/O in early Fortrans.

For example "Press 1 to exit, 2 to edit settings, 3 to start run?" instead of just "?"


Well, Unix was typed in on terminals that only had upper-case fonts. So it all looked uppercase.

Then when we got terminals that had lowercase fonts too, the Unix source was surprise! all lowercase. Because that's how it was typed in and nobody noticed.


That's hilarious :)


Because Unix won. Other contemporary OSes, like IBM's VM on mainframes, DEC's VMS, Digital Research's CP/M, and even Microsoft's MS-DOS, all favored uppercase.


Many of those older systems relied on peripherals that only did upper case, such as Teletypes.


Some getty implementations support uppercase-only terminals - the special case-folding mode is enabled automatically by typing an all-uppercase username at login.


Wonder if it goes even further back, to Morse code, as there was no 'case' in that, just letters and numbers. I think the custom was to use uppercase there. An interesting project to look into someday, I guess.


In the history of typography, capital letters came first, and it still remains standard in a number of contexts to use all capitals, for instance, almost all text you see carved in stone is all capitals, and most newspaper headlines are in all capitals. For some words -- for instance, proper nouns/names -- all lowercase is literally wrong.

So to almost anyone, seeing a long block of text in all uppercase may look stodgy or ugly or formal, but all lowercase looks wrong or provocatively artsy, so if you have to choose one or the other, all capitals is kind of a no-brainer.


I used a similar point as an argument in primary school. They dinged me for sloppy cursive writing so I ended it right there.

All upper case, caps were just bigger than the others.

Stayed with that until I left High School. Was making some point or other, or was just a PITA.


At the time handwriting was being taught in school, elbow problems (complications from hemophilia) made handwriting quite painful, so I was using a (silent, electronic) typewriter for in-class assignments.

Needless to say, typing handwriting assignments does nothing to improve handwriting!

Nearly forty years later, I still haven't learned to write in cursive, and my lower-case printing is either illegible or painstakingly slow. So, with the exception of single-character mathematical variable names, upper-and-small-caps it is.


I think a lot of students who grew up with ballpoints reached a similar conclusion. I knew a few and from time to time have used the strategy when legibility is required.

Cursive was developed to work with traditional pens, and it really does not flow well on a ballpoint. But the school systems, of course, never made such a distinction.


Totally.

I remember just quitting. 26 alphas + numbers and punctuation, and that's gonna be IT.

One counterargument was legal signature requirements. Had a parent from city hall clear the law up for me.

Was loaded for bear, and my little school really opposed not writing cursive. I remember a few of us, almost all boys, going down that road.

Interestingly, quite a few girls ended up with a sort of print-script we see in cutesy fonts today. It went totally without comment!


"Cursive" means slanted. "Current" means joined-up (from the french "courant", literally "running"). Not all cursives are current - Chancery cursive, for instance - and most recent schoolroom current writing isn't cursive.


That may be what it means in a technical sense, but what most of us Americans were taught in elementary school was joined-up writing that was presented to us as "cursive," the only (and obviously much more sophisticated) alternative to "print".


That was my experience exactly.


Maybe because some international alphabets have letters that only exist as lowercase. For example, the German ß didn't have a capital form before 2008.


So you don't need to distinguish keywords from variables by case when you have syntax highlighting, and lowercase has been shown to be faster to comprehend than uppercase while driving[0], so switching that metadata signal to a secondary channel (color of the text) seems natural. colorForth[1] was made on a similar premise to exploit that channel, but as a read-write rather than just read channel.

[0]: https://trid.trb.org/view.aspx?id=475144

[1]: https://en.wikipedia.org/wiki/ColorForth


Word processing, perhaps? Wordstar/Wordperfect.


If you look at any normal written text, most of it is lowercase.


I think this is part of the reason why many systems in the past favored uppercase for programming languages: it’s easier to tell apart comments from code at a glance.


Many early character sets (CDC Display Code, UNIVAC's Fieldata) were only 6 bits wide, so there wasn't room for two cases of the letters until 8-bit EBCDIC and 7-bit ASCII came along.


For me, it’s mostly because uppercase letters either require toggling capslock or holding shift.


Later versions of the Apple II line did eventually support lowercase. I forget which had it first, but the IIgs certainly did.


The II+ was basically a II with support for lowercase and 80-column lines added by means of an expansion card.


Actually, the ][+ was still 40-column, uppercase. I remember wiring a pin off my keyboard controller card to the paddle button input and adding a replacement character ROM to get lowercase support.

It was the //e and //c that had built-in lowercase support.


We had an Apple II+ with the limitations you describe. However, we purchased and installed the 80-column card that provided both 80-columns and lowercase characters, which is what I think the parent comment is suggesting.


I remember adding 16KB or 32KB to get to a total of 48KB of RAM with a card that piggybacked onto the existing chips. Was that the same upgrade that enabled lowercase letters?

I mostly remember that the RAM upgrade changed the gauges in MS Flight Simulator from octagons into rounder circles.


The first computer I ever used was a second-hand Apple ][e my parents brought home one day when I was little. I got started with BASIC on that thing.

I actually didn't realize until I had to use a ][+ at school later that some computers didn't do lower case :)


Interesting... did the ][+ support an add-on 80-column card? My hunch is no, but I'm not sure at all.


Yeah, it did. Also a Z80 card so you could run CP/M. With CP/M I could run Fortran and C on my ][+; it was a lot easier to get pirated versions of CP/M software than original Apple Pascal or Fortran.


No, that's not correct. The II+ was a II with Applesoft BASIC instead of Woz's Integer BASIC, and the ability to automatically boot from disk at power on. That's it.

Lowercase and 80 columns were available from third-party vendors for both the II and the II+.

Source: I still own my II. Also:

https://en.wikipedia.org/wiki/Apple_II_series


Worth pointing out (for those interested) that Applesoft BASIC added floating point support, and was supplied by Microsoft.

BASIC and the monitor were stored on ROM chips on the motherboard. The II and II+ were basically the same computer with a different set of BASIC ROM chips installed.

Edit: It is actually possible to have both sets of ROM chips installed if you use a Firmware card. Very handy!


Huh. I got my II+ with the 80-column card installed. I guess I just assumed that was standard. Being more than 40 yrs ago, I could be misremembering.


It was probably a Videx Videoterm:

https://archive.org/details/Videx_Videoterm_Installation_and...

It was common for Apple resellers at the time to sell them with third-party add-ons.


I want to know why the Dragon 32 didn’t have lower case characters but displayed inverse coloured upper case ones instead. Used to play havoc with Basicode 2 programs.


The 6809 and accompanying Motorola 6833 SAM chip in your Dragon that handled the graphics was originally meant to be a video terminal - and uppercase was just fine for that. The later versions of the SAM did have a proper lowercase. Those showed up in later TRS-80 CoCo II but took a bit of prodding to activate.


Actually, the 6883 SAM (synchronous address multiplexer) is the DRAM controller and the 6847 VDG (video display generator) is the one that has the various video modes. See the character set implemented by the internal ROM in figure 20 of https://people.ece.cornell.edu/land/courses/ece4760/ideas/mc...


You're correct! It was the MC6847T1 update that allowed for lowercase.


BECAUSE STEVE LOVED TO YELL!!!!!


Server was hugged by HN it seems.


it seems they had an error connecting to the database for lowercase letters



back then we used to call it slashdotting, what do we call it now?


The HN "hug of death" is one I've heard frequently.


Same, seems to have been used ~644 times (including at least one use of “Reddit (or HN) ‘Hug of Death’”) https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu... (compared to 433 times “slashdotted” has been used here https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...). Edit: The number seems to change with the sort criteria (popularity/date). Weird.


Whatever happened to Slashdot? Did the slowness of new stories appearing + their comments system turn people off eventually, compared to interfaces like here?

I used to read them daily.


10 years ago I scraped a representative slice of all /. postings from the fall of every year. It was apparent that the rate of new UIDs was falling and that the bulk of the posting was done by a limited and shrinking cohort in the range of 100K to 400K. Mobile boosted the prevalence of AC posts but that didn't compensate for the drop off. I ran a regression out and predicted they'd be dead by 2023. Looks like they're on track.


Not being able to up/downvote comments was a major turn-off. They'd throw a whopping 5 voting credits in your direction every few weeks, but that only served to antagonize people even more. They tried to fix that by giving away meta-mod rights (to let you decide if other people's up/downvoting was fair), but again, that wasn't what was needed. So as soon as HN had enough momentum and started to become a viable alternative, people left in droves. It was a true exodus. Not really sure who stayed there...


I think the best part about Slashdot in its best days was the people. You could be reading an article about Python and it was possible to see Guido in the comments. And while that was rare, the other upvoted comments were usually high quality, from really knowledgeable people, and the noise was held down pretty well. As the good commenters left, all that was left was the noise. If you go now and look at the comments, often everything is downvoted except for a small number of OK comments. But most of the experts and really knowledgeable people are gone.

HN seems to have filled some of the gap it left, but I don't know of any other site since that has the same volume of really high-quality content. I see some comments here blaming the fact that not everyone could do meta-moderation, but I'd say it was the opposite. When they had decent editors, things were great. Your content will only be as good as your moderators, and I think Reddit proves that simply having more of them doesn't assure better content.


After several management shakeups, the editors went off onto some weird tangents with video segments, allowing off-topic posts, and paid content masquerading as stories.

That's when I quit. Why did you stop?


I stopped because they were clearly monetizing content and the comments were overrun by nazis and they clearly didn't care

Edit: I coulda just said the nazis part but some folks here don’t wanna see ads so


/. -> Digg -> Reddit -> HN


Sometimes I feel bad for HN, full of all us fleeing Redditors.


I use both Reddit and HN. HN has much better content (both links and comments), but it has a singular focus on tech/things tech people generally like.

On the other hand, there’s a subreddit for practically every hobby. In many cases, Reddit is home to the largest community related to a hobby. Sometimes things related to my interests outside of tech pop up on here, but the discussion is rarely in depth.



I still read it once a month or so, when I've "finished" HN for the day and am bored.


> back then we used to call it slashdotting, what do we call it now?

That was back when slashdot was a real tech site.


Maybe we can start something new!

Attention bomb?

Your suggestions here, go!


still could call it that

Either way, there will be some of us asking, "wut?"


Looks like archive.org itself is offline now.


probably due to lack of money. zero checking. zero savings.


Written by Woz himself!


because 1 button... like the iphone or imouse? /s


why do lower case letters even exist? in 100 years might they be done with?


Some argue that mixed case texts are more readable because the ascenders and descenders provide "synchronization points" for our eyes in scanning text. Also (and less controversially) uppercase is larger so it is slower to read it, because there are more pauses to move your eyes to the beginning of the next line.


There are plenty of languages without upper and lowercase, I've never heard of them being slower to read.


Yes, that's why the first point is more controversial. But the second is interesting: alphabets that lack lowercase, like Korean hangul, have bigger characters but they pack more information into each of them.


thanks for this. but c.f., javascript vs python syntax


Lower-case letters evolved from handwritten Roman letters. Upper-case is exactly carved Roman letters. Mixed-case comes from initials — https://en.wikipedia.org/wiki/Initial


they are way easier to read than all uppercase.


since the responses have mostly been literal (no pun intended) perhaps i should clarify, i meant "case" in general


I am sure this could be answered in a one liner but the article is going to be long as hell


True of most articles but doesn’t automatically mean reading them is wasted time.


Imagine if Woz had had access to a UBI and universal healthcare instead of being super broke and living right on the edge. What could he have created with just a few extra resources? How many would-be Wozs are out there now too scared to spend that $300?


He had a job at HP so no need for UBI (probably would have paid less than HP). Health care costs certainly were not an issue at his young age.

Better instead if he had had some investor cash earlier. But as a hobbyist, he wasn't playing the "long game" anyway, just trying to make something cool for himself and bragging rights at the 'Brew.


OP's comment wasn't about Woz specifically but the population of potential Wozs who, but for a few hundred dollars, could have contributed something great to the world but didn't.

Right now a generation of young people in the United States are completely priced out of going to college, or facing eviction from COVID related layoffs. The slightly older cohort are seeing a much higher proportion of their wages go to high rents and student loan debt than Woz's generation.

As a society, we're missing out on bottom-up innovations because would-be inventors are too busy trying to survive to create.


Doesn't sound like it would have paid much less and he would have had much more free time, but it seems he was so enamored with working at HP he didn't mind getting subsistence wages as an engineer.


Working at HP at that time was a pretty sweet gig. It would have been like getting a job at a FAANG today.


It seems more akin to working at a game company where the companies take advantage of the fact that so many people want to work in the game industry they can keep salaries relatively low.


True... but also, at that time you did not make much as a developer anywhere. When I started this in the '90s, if you made 50-60k you were the super elite of programmers and had been doing it for 20+ years. Usually if you stayed at a company they would take care of you. One guy I know went the full route at one of those companies and retired in the '90s; his retirement was very nice. The dotcom bubble changed everything with respect to pay and long-term comp.



