
Tuesday, August 02, 2022

The Paperless Office

I remember when Felix Foss came around to talk about the "paperless office" that implementing the System/36 would enable.

The other day I got on the Facebook group for current and retired FSA employees.  There were 146 messages on the subject of what office equipment should be purchased for the coming year.  It seems that most of the messages concern equipment for handling paper and folders.  https://www.facebook.com/groups/54686876198

Thursday, April 08, 2021

Blasts from the Past

Two things, unconnected except they both recalled my past:

The Post is running "Classic Doonesbury".  In a recent one Alex, the daughter, is giving her father her Christmas list.  She wants a Pentium PC so she can keep up with her classmates, and a 28.8 modem.  I want to say I remember my first modem, but I don't.  Could it have been 120/240 baud or 1200/2400 baud?  I definitely remember the big advance up the ladder to a 28.8K baud modem.  I suspect these days few people remember a "baud" (on those early modems, roughly one bit per second, where 8 bits made one byte, which was one character).  Back then I was going on-line through Compuserve.  So much has changed since then.
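For a sense of the scale involved, here's a back-of-the-envelope calculation (a quick Python sketch, using the old rule of thumb of about 10 raw bits per character once start and stop bits are counted):

    # Rough transfer times for a 100 KB file at various modem speeds.
    # Assumes ~10 bits on the wire per character (8 data bits plus
    # start/stop overhead), the usual rule of thumb for serial links.
    FILE_BYTES = 100 * 1024
    BITS_PER_CHAR = 10

    for bps in (300, 1200, 2400, 28_800):
        seconds = FILE_BYTES * BITS_PER_CHAR / bps
        print(f"{bps:>6} bps: {seconds / 60:6.1f} minutes")

At 300 bits per second that 100 KB file takes nearly an hour; at 28.8K it takes about half a minute.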

Going back even further in time, at some point in the 1940's-50's our poultry flock was hit with Newcastle disease.  We had a run of diseases at that time, leading us to change the hatchery supplying our chicks, so I don't remember how bad it was, how many hens it killed, how many eggs weren't laid.  I do remember the death toll one of the diseases took: carrying the dead hens out and tossing them off a hill into a swamp (I know, not good, but that was another time).

The NYTimes had a Science article on virologists being able to use the Newcastle virus as a vector for getting a vaccine into humans.  (Newcastle doesn't do much to people.)

Monday, April 13, 2020

DOS and COBOL

Last week I saw references to both COBOL and DOS (see this FCW piece and this piece from Slate); I think both in connection with unemployment insurance systems which are running on ancient software.  I never did much programming with DOS (I was more into WordPerfect macros), but I did take courses in COBOL and did one application as a sideline to my regular job.  The System/36 ran COBOL, as did the mainframes in Kansas City.

I can understand why both private and public organizations still run COBOL.  Every change of software runs the risk of creating new problems, so if you've got an application that runs without problems and supports the organization, there's little reason to switch to a newer language.  That's particularly true if the organization is adding new programs or functions, since the available people and work hours are needed to support the new.

All that said, the downside of keeping the old programs is you have to live with the old silos and the old thinking, forgoing chances to integrate, and likely forcing you to invent kludges or bridges on occasion.  For example, in issuing the federal payments under the current program (CARES), I suspect Treasury had to write new programs to match ID's in IRS files against those in unemployment files.
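To make the kind of bridge I'm imagining concrete, here's a minimal sketch (in Python, with hypothetical file names and field layouts; this is not the actual Treasury code):

    import csv

    # Hypothetical extracts: each agency's file carries the taxpayer ID
    # under a different column name, the classic symptom of old silos.
    def load_ids(path, id_field):
        """Return the set of IDs found in one agency's extract."""
        with open(path, newline="") as f:
            return {row[id_field].strip() for row in csv.DictReader(f)}

    irs_ids = load_ids("irs_extract.csv", "TAXPAYER_ID")
    ui_ids = load_ids("unemployment_extract.csv", "CLAIMANT_SSN")

    # IDs present in both systems--the matches a bridge program would flag.
    print(f"{len(irs_ids & ui_ids)} IDs appear in both files")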

Monday, August 12, 2019

A Gripe About Dell

The 1 year warranty on my Dell desktop is expiring, so I was looking into an extension.  That caused me to become very unhappy with Dell:

  • when I went to their website, I was able to find an extension for about $42 a year.  The page promised a 15 percent discount for ordering online, although there was a phone number to extend by phone.
  • the page did not offer any obvious link to a description of what was or was not included in the warranty.
  • when I added it to my shopping cart and tried to check out I couldn't.  On separate days I got the message that the page was no longer available.  One day I got a message saying the code was wrong--something about the length of the HTTPS header exceeding 8140 bytes.
  • there was no apparent way to contact Dell about the website problem.
  • when I called the support line, I explained my problem to four separate people (each one very nice, and the first three transferring me to someone they thought could help).
  • the last person got me so mad that I forget what his explanation was--IIRC he seemed to be saying the problem was known.  Although the web page said my warranty expired on the 12th, he claimed it was actually the 11th.
  • after a day to cool off, I called the number on the web page.  The woman attempted to explain the elements of the warranty and gave me a price of $350+ for a 3-year extension.  I asked for something in writing, which she promptly sent to me.
  • the Dell explanation of its warranty service was long and legalistic.  I understand why--it's trying to cover all legalities in all the states--but what I really wanted was something more sales-oriented: a chart showing the different options (the guy from yesterday seemed to say there were different levels of support) and their costs.
Bottom line:  while the people were polite and did their best, I conclude Dell makes them work within a flawed system, which will cause me to think seriously about a different vendor for my next desktop.  Meanwhile, I'll take my chances with no warranty--if I need help, which I usually don't, I'll pay for support for that episode.

Monday, August 13, 2018

Sometimes I'm Stupid

Although I'm not sure whether it's plain stupidity, impatience, or stress.

As I posted yesterday, I bought a new PC on Saturday, since the old one was giving the blue screen of death.  What I missed, what was stupid, was the fact that the good people at Microsoft had a QR code (like a 2D bar code) associated with the blue screen and error message.  I finally woke up to that fact today.  I had, fortunately, taken a picture of the screen and QR code on Friday, so I did a Google search for the image--found it and an explanation of the error code.

Now I'm not sure whether, when I follow up on the error code, I'll find a cause showing I was too hasty in buying the new PC.  But it does make me feel stupid.

Thursday, December 07, 2017

How Times Have Changed: Test Data

The Times had an article on the theft by three Homeland Security employees of a set of personal data of DHS employees.

What were they going to do with the data?

Well, they were going to write software, or rather copy and modify the IG's software for managing IG cases and sell it to other IG's.  And the stolen data was going to be used to test the software as they developed it.

What a change in 30 years.  Back in the 1980's and early 90's I very casually moved around sets of live data saved from county office systems to serve as the basis for testing new software.  While we had the Privacy Act requirements, we weren't really conscious of privacy restrictions and security.  Consequently I, and others, could do then what would be firing offenses today.

Wednesday, November 25, 2015

Fortran?? Really, Fortran

FCW has a post on supporting Fortran, by "accommodating the legacy code with an open-source Fortran compiler to help integrate the programming language into a larger pool of computer languages in supercomputers."

Fortran was old when I was learning COBOL back in the 70's.  And most people in the US have never heard of either; they're too young.

Tuesday, November 17, 2015

Feeling Old With Windows 10

Upgraded my desktop PC to Windows 10.  The Microsoft people are getting better at transitions--practice makes almost perfect I guess.

We've come a long way since the days of DOS and the command line.

Monday, September 21, 2015

The Turing Test and Humans

The Turing test is the famous method for determining whether computers can think--can the computer's conversation with a person be so good it can't be distinguished from that of a human?

There was a piece I read today discussing other tests for distinguishing computers and humans.  But I want to discuss going the other way--distinguishing humans.  I'd suggest the only way to distinguish humans from other entities, whether they be computers or chimpanzees, is the genetic one.  By that I mean that a human is born of another human and contains DNA from one or more humans.

When you expand your mental image of "human" from a mature adult to include infants and the mentally and physically challenged, I don't think there's a reliable performance test.  The reverse Turing test doesn't work--many humans cannot converse, a few have no language at all.  So I think, rather than performance, the only test of humanity is the genetic history.

Tuesday, August 11, 2015

Kevin Williamson Is Wrong: Foreseeing the Future

I'm nitpicking here. He writes at the National Review:
"No one in 1985 knew, or really could have known, what computers would be like ten years down the road, or twenty."
(It's in the context of mocking a NYTimes columnist in 1985 who wrote that laptops were a bad idea, and moving from that to the idea that we can't foresee the future, so the market beats government.)

Now I remember old laptops.  We had a Zenith laptop at work which we took to a training session.  Actually, it wasn't a computer to put on your lap--it was a portable computer, a luggable.  I also remember something else, something called an electronic calculator.  When I worked at my summer job in 1959 and later, I used an old hand-crank manual adding machine.  By the end of the 60's electronic calculators had arrived on the scene, and by the end of the 70's we had programmable calculators.  Innovators in county ASCS offices had started to buy the calculators and program them to compute program payments and loan amounts.  I remember a GAO report urging the agency to establish centralized control over them.

Anyway, no more memories.  My point is that by 1985 we had seen the effects of Moore's law; the capabilities of calculators had exploded and their prices had imploded.  We also had seen the progression from mainframes to minis to micro/PCs.  So anyone with any sense of the history of the past 20 years would have known that computers were going to get smaller and more capable.

And someone, like Al Gore, was on the verge of inventing the Internet, or at least seeing that an obscure military/academic tool needed to be opened to the public.


Monday, June 29, 2015

Programming Languages and System Development

It's almost 40 years since my first programming courses.  I never got paid for programming, but I did find ways, by stretching my job responsibilities, to do some programming during work hours, or after.  My first language was, of course, COBOL.  I also did a very little PowerBuilder, some Javascript, and a lot of WordPerfect macros.  But that ended almost 20 years ago, so there have been a lot of changes since.  I read stuff, and see references to Python, and PHP, and Github, and wonder what the hell?

So I really enjoyed this very very very long post.  It told me just enough about current times, even though I had to split my reading over 3 days.  A whole lot has changed--no mention of "waterfall development", no mention of James Martin, etc. etc.--but some things haven't, as witness this quote:

"Most of your programming life will be spent trying to figure out what broke, and if the computer helps you, maybe you can watch your kids play soccer."


Friday, November 28, 2014

Memory and Reality

Saw somewhere a description of a study of how well Americans remember their Presidents.  The bottom line was that we remembered the first 4, Lincoln/Johnson/Grant, and FDR, and not the ones in between.  The explanation was that memory is refreshed by usage--if we don't have occasion to recall Polk, we won't remember him.

That makes sense I guess, but there's also another phenomenon going on: the accumulation of true and not-so-true memories around certain figures.  It's something of a geological process; some figures are built up and some torn down.

As it happens, there seems to me to be an example in A.O. Scott's review today of the new biopic on Alan Turing.  Turing is a figure who is becoming more and more prominent, partially for good reasons--his contributions to the theory of computing and to British code-breaking in WWII--and partially for understandable reasons: his homosexuality and tragic fate.  But IMHO he's getting props which are undeserved as well.  Scott writes:
" There are lines of dialogue that sound either anachronistic or — it may amount to the same thing — prophetic. It is thrilling and strange to hear the words “digital computer” uttered a half-century before any such thing existed,.... [emphasis added]
This puts him 50 years ahead of the game, which isn't true.  The first mention of "digital computer" in Google Ngrams is in 1940, which is roughly when the first digital computers were being built, perhaps 4 years after Turing's big publication.  There's controversy over the definitions here, but the bottom line is that several people were working in the field.  But 100 years from now Turing will be remembered as the inventor of the computer, just as Edison is remembered as the inventor of the light bulb.

Monday, April 07, 2014

50th Anniversaries: IBM 360

Here's a long piece on the IBM 360, which celebrates its 50th anniversary, at least of its announcement, this year.  This Wikipedia article says there's no working 360 in existence.

The trip down memory lane naturally led me to this Wikipedia article on COBOL, which I see is still around and kicking.

Friday, December 13, 2013

COBOL Lives!

So says the FCW, in this article.

What really surprised me was not the continuing use of COBOL in legacy applications, but the fact that a quarter of colleges still teach COBOL and for some it's still a required subject.  I would have thought that COBOL was so old-fashioned and unappealing that it would have died out in the realms of academia, even though there's still a need for people who know it.

For legacy work, I suspect there are still things where it works pretty well.  Consider the example of payrolls, one of the early applications of computers.  You do payrolls every two weeks, or every month, which means batch processing must work okay.  No need for fancier languages which support objects or whatever is today's hot concept.
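To make that concrete, here's a minimal sketch of the sort of batch run I mean (in Python rather than COBOL, with invented record fields and rates):

    import csv

    # A toy biweekly payroll batch: read a file of employee records,
    # compute gross pay, and write one pay line per employee.
    def run_payroll(infile, outfile):
        with open(infile, newline="") as src, open(outfile, "w") as out:
            for rec in csv.DictReader(src):
                hours = float(rec["HOURS"])
                rate = float(rec["HOURLY_RATE"])
                # Time and a half past 80 hours in the two-week period.
                overtime = max(hours - 80, 0)
                gross = (hours - overtime) * rate + overtime * rate * 1.5
                out.write(f"{rec['EMP_ID']},{gross:.2f}\n")

    run_payroll("employees.csv", "paystubs.csv")

Nothing fancy: read a batch in, write a batch out, which is exactly the shape of work COBOL was built for.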

I started programming in COBOL back when I was disillusioned with my bureaucratic career.  Then, after I stayed in the bureaucracy, I got quite good with WordPerfect macros, back before the WYSIWYG days.  Finally I did some Javascript in the mid 90's.  But these days Python seems well beyond me, and not something useful.  It's a shame; there was a rush of satisfaction every time you completed something and ran a test and it worked correctly.  Of course, that rush was usually followed by the frustration of failure when the next test bombed. 

Did anyone notice that Google had a tribute to Rear Admiral Grace Hopper, one of the mothers of COBOL?

Thursday, May 09, 2013

John Dvorak's Rule

Used to be, according to Dvorak, who was a columnist for a PC mag (either PC or Byte), the PC you wanted cost $3,000.  That rule is long gone.

Wednesday, February 27, 2013

COBOL and Binary

Back in the dark ages when I learned COBOL, the prerequisite was a course on computer basics, including number systems, binary, hex, etc., which is why I unabashedly steal this joke from James Fallows:


"There are only 10 types of people in the world: those who understand binary, and those who don't."

Sunday, October 07, 2012

The 8 Inch Floppy

Govloop has this post, with a very young Bill Gates balancing a floppy disk on his finger.  When I first saw it, I thought it was an 8 incher, but it's more likely a 5 1/4 one.  As an 8 incher, it brought back memories of the IBM System/36, the minicomputer which ASCS used to automate its operations. 

(Going even further back, in the early 70's there was a pilot project to put remote terminals in county offices.  The storage at that time was an IBM 7.5 meg disk drive.)

Thursday, June 28, 2012

Minitel and Compuserve

The Times has a story on the impending demise in France of Minitel.  Minitel was once the very popular French version of the Internet, or rather an intranet, since it was all proprietary hardware and software.  The French were way ahead of the rest of the world with computerization in the home.  The U.S. had some experiments, which failed, one of which was by Time-Warner, but the French developed such a widespread platform that even Norwegian bachelor farmers in Brittany adopted it, using it to maintain the registrations of their cows, etc.

But since it was proprietary and not open, it's lost out in the competition with the Internet and PC's, lost out at least in the marketplace if not in the hearts of some of those aforesaid farmers.

Compare France with the U.S.  Compuserve was an early networking outfit, but because we already had PC's penetrating the market it was software only; the hardware was PC's.  Compuserve was eventually outcompeted and then bought by AOL, which reached for the stars in merging with Time-Warner, only to fail in competition with the open interface of Internet browsers.  Sic transit gloria mundi.

Friday, June 15, 2012

Catnip Topics: PC Hardware

There are some topics on popular blogs which the readers will react to as cats react to catnip.  It's not a pretty sight.  One such topic is advice on PC hardware--Kevin Drum asked for comment on a report that some Mac user replaced the computer's hard drive 3 times in 2.5 years.

His commenters rose to the topic, notably competing for the title of whose first computer was the oldest and smallest.  Not sure why, though it's probably the same logic that leads us geezers to talk about how hard our lives were compared to today.