
Why schools don't want old computers

tezza

Veteran Member
Joined
Oct 1, 2007
Messages
4,731
Location
New Zealand
A few threads have touched on the use of vintage computers in schools. Rather than hijack those, I thought I'd start a new one.

I think the reasons why vintage computers see less use in schools are twofold:

1. The "perception" factor. Vintage gear gives the impression that the school is poor and not up with the times. This has little to do with quality of LEARNING of course (that's more to do with the teachers and curriculum), but may be important for attracting prospective pupils.

2. Computers play a different role in education now compared to, say, the 1980s. In the early-to-mid 1980s, computer literacy was mainly about knowing how the beast worked (through programming). From roughly 1985 to 1995, computer literacy was about knowing how to word-process, drive a spreadsheet, or set up and use a database. Now, "computer literacy" is about knowing how to use the computer as a communication tool and knowledge portal.

That means using the Internet with all its multimedia resources (video etc.), and for that you need a modern computer.

Tez
 
Everyone is brainwashed into thinking they need a 2.6 GHz quad core to surf the net or watch a DVD. And in order to be "cool" you have to have that 2.6 quad core.

A load of carp! I remember my Dell 450 MHz, with one of the first DVD-ROM drives, did just as well!




 
That means using the Internet with all its multimedia resources (video etc.), and for that you need a modern computer.

Tez

Perhaps I'd better define "modern" here. I mean a P2/P3 and upwards as opposed to a P1/486 (and downwards).

Tez
 
Everyone is brainwashed into thinking they need a 2.6 GHz quad core to surf the net or watch a DVD. And in order to be "cool" you have to have that 2.6 quad core.

A load of carp! I remember my Dell 450 MHz, with one of the first DVD-ROM drives, did just as well!

The programs I run require alot :+>
 
2. Computers play a different role in education now compared to, say, the 1980s. In the early-to-mid 1980s, computer literacy was mainly about knowing how the beast worked (through programming). From roughly 1985 to 1995, computer literacy was about knowing how to word-process, drive a spreadsheet, or set up and use a database. Now, "computer literacy" is about knowing how to use the computer as a communication tool and knowledge portal.

Yes, so true. And in the near future (5-10 years), they may not even want or need computers in schools, as personal information and communication devices may evolve past the school even needing to provide them anymore. Let me say that differently: our kids will be going to school with their own information/communication devices as a requirement for school, just like pencils and notebooks. Notice this means such devices will be really cheap, like calculators. The point is, people won't view them as computers anymore, just devices as ubiquitous as a pencil (and no, I do not believe such devices will replace the pencil; not for another 10-20 years at least).


I half seriously told my boys yesterday (ages 11 and 13) that their next computer would be something out of the '80s. I almost think they would learn more about computers from an old TRS-80, Apple II, or early DOS machine than from trying to use anything modern. The reason is there was simply not that much to do on the older computers (relatively speaking), so you went searching for new stuff and learned as much as you could. But today there are so many things to do on a computer that it's easy to get distracted and never focus on a single path (well, except maybe games, which all kids like to play to excess). Maybe an older computer would focus their interest. OK, not really, but it's a nice thought.

I'm not saying the good old days of computers were better. They weren't. They were just perhaps more exciting, and that made having a computer in school in the early '80s a really fantastic thing. I enjoy my newer computers a great deal. I mean, my god, my MacBook Pro is a laptop that runs Unix, Mac OS, and Windows, plays decent games, and does everything else a computer can in a 5 lb package. I would have died for such an amazing machine in the '80s and early '90s. But today, nothing really excites me about a modern computer.

Today, computers are not a passion (again, relative to the passion people felt for computers in the '80s). Video games are a passion, but not computers. The Internet isn't even a passion; it's just cool (and a good way to waste time). But it is the constant upgrading of the Internet that makes older computers useless now.

On the bright side, aside from video games, any Pentium 4 and beyond is a very capable machine for what schools need, and such older machines are a dime a dozen now. The downside is: what school has the time or people to deal with re-assembling/fixing older PCs and installing software on them? It's so much easier to buy them pre-installed. This may be the bigger issue with older computers in schools.
 
The programs I run require alot :+>
Do you have a program to tell you alot should be two separate words? :D

2. Computers play a different role in education now compared to, say, the 1980s. In the early-to-mid 1980s, computer literacy was mainly about knowing how the beast worked (through programming). From roughly 1985 to 1995, computer literacy was about knowing how to word-process, drive a spreadsheet, or set up and use a database. Now, "computer literacy" is about knowing how to use the computer as a communication tool and knowledge portal.
About computers, I often wonder what they actually teach the students. Is it more than point-and-click? I think they should first learn some basic commands, what the BIOS is, making batch files, etc. Maybe then they won't be bothering poor Rajib in India so much. ;)
 
Quote:
2. Computers play a different role in education now compared to, say, the 1980s. In the early-to-mid 1980s, computer literacy was mainly about knowing how the beast worked (through programming). From roughly 1985 to 1995, computer literacy was about knowing how to word-process, drive a spreadsheet, or set up and use a database. Now, "computer literacy" is about knowing how to use the computer as a communication tool and knowledge portal.
Certainly in the UK, computer education is separated into IT or ICT (Information and Communication Technology), which is word processing, spreadsheets, and database stuff (more point-and-click), or Computing, which involves (at least at the college level) programming, networking, binary and hex sums, and learning how the computer works in medium detail (think block diagrams of system architectures and the like).

The term "computer literacy" is subjective. Someone who builds computers from scratch may not consider someone who can run their office from the PC computer literate if they cannot handle simple errors or troubleshooting, yet the latter may have a certificate from an organisation or educational establishment that tells the populace that they are.
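The "binary and hex sums" part is easy to make concrete. As a quick, purely illustrative sketch in Python (the specific numbers are my own, not taken from any actual syllabus), this is the kind of base-conversion exercise such a Computing course covers:

```python
# Base conversion and "sums" in binary and hex, the sort of exercise
# a Computing (as opposed to ICT) course might set.

a = int("1011", 2)   # binary 1011 -> decimal 11
b = int("2F", 16)    # hex 2F     -> decimal 47

# The sum itself is ordinary integer arithmetic; only the printed base changes.
total = a + b        # 11 + 47 = 58
print(bin(total))    # 0b111010
print(hex(total))    # 0x3a
```

The same working a student would do on paper (carry bits in binary, carry digits past F in hex) collapses into one addition here, which is rather the thread's point about computers hiding intermediate steps.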
 
Very good point, Tezza and BBCmicro. When one sits down and actually charts out the 'computers' field into subfields of knowledge or expertise, there can be many layers. Hardware or software? If hardware, what type? Computers themselves, network appliances, storage? The actual physical assembly of them, or how they work/electronic engineering? And so on...
Software: as someone mentioned above, programming or applications, specialized software, OSes, etc.? Web programming, platform-specific programming? Object-oriented? Subfields after subfields, with enormous amounts of information in each one even as you delve further.
I remember being taught on the II+/IIe in school. Our teacher really didn't know ProDOS or Logo or Pascal any better than we did :) He tried, though. We learned a bit of programming/logic/algorithms, the applications of the day and platform, and games. Those of us who were interested took it further on our own, or pursued it in post-secondary. Computers were computers; they didn't really affect any other classes, except perhaps math, depending on how creative a teacher you had. Now, computers are ingrained into everything we do, involved with almost every class as a vehicle for said class, not just computing for computing's sake. The dynamic has changed, but it's too bad we couldn't hold on to teaching more people just a few more fundamentals about the tools they use every day :)
 
Basically, it boils down to this:

1) The companies are working to make hardware and software idiot-proof so that NO knowledge of any kind is required to accomplish any task.

2) The universe is working to make a better idiot.

3) The universe is winning.
 
First of all, the two basic parts:

1. Learning about computers
2. Using computers

I can't really imagine why, but the two get terribly confused. For example, I personally am very interested in paper history and the technology of papermaking, but that to me is only remotely related to reading, writing, and everyday uses of paper. In the same way, just because a school uses computers to manage information and write or produce assignments does not mean that the students are learning about computers.

I do think that computers are important enough to modern society that a little education about them is in order. That is the part which is missing in schools. Few kids could mention 5 operating systems out of the hundreds in use today. It is likely that they don't even have a clear idea of what an OS is, or could be. And could they define what a computer is, other than mentioning some commercial products? I'm not saying that there are clear answers to this. I am saying that it is a subject to be studied. We are talking about schools - right?

I think that a curriculum put together by senior CS professors based on knowledge, experience, and perspective, is needed. This is what happens in other subjects. The lower grade teachers themselves don't decide what constitutes modern math, for example. They have a curriculum to follow which is written by eminent academics in the subject. That is the only way to guarantee quality and keep the bar high enough for successful education. In fact with most subjects we take that process for granted. But not with computers.

What I've noticed is that most teachers and school officials seem to believe that they themselves are those experts. I'm embarrassed for them. They should know better than to put on such an academic scam. In any other field they would be called on it and probably fired. That, in my opinion, is the core of the problem.
 
Very true... it's not just confined to schools either, unfortunately. My company had a sales conference this last year, as they usually do. Since the company owners and the President/VP types are there, they haul an engineer (me) in to babysit their sound system/PowerPoint/etc. for the day instead of conscripting one of our regular board ops. So they have me sitting there cooling my heels when I could be doing actual work... but I digress. That's a whole 'nother rant :)

A great deal of focus has been placed in the last year or two on the web presence of our company, and so several 'experts' have sprung up in the company as a result. I am not a web programmer or designer. I could code some mean HTML/JavaScript etc. a few years ago, and that's it. I do know enough to know how much I DON'T know about it, though. I am also reasonably familiar with the networking and infrastructure involved.

The VP of sales for the company got up at the end of the day and made a little closing speech. It was embarrassing. Our chief engineer (my boss) had stopped by to give me a hand breaking down equipment when it was done, and he also caught the speech with me. I'm not going to bother relating the VP's misinformation, or the depth of his ignorance, but suffice to say he should not have made much of that speech to a room full of people who knew the same or less than he did about the Internet. I am not saying he should be an expert. All I'm saying is that I would never get up at a conference held by another department and talk about what they do... at least not without having my information double-checked by someone. Scratch that: I wouldn't do it. I know enough about our sales department, for example, to be dangerous, and that's it. I have no business giving speeches about it, and I'm sure nobody would tolerate it or take it seriously if I did. For some reason, though, it seems to be OK for anyone and everyone to get up and speak about technology as if they were an expert. That's one of many examples I could mention just within my company.
Whoa... that was a bit of a rant :) Sorry. I just twigged on to what Ole Juul said about teachers and school officials considering themselves experts.
 
That used to bug me. An introduction to computers is good, and any technology could be used; it's not like new computers don't have most of the same parts everything else did. I was impressed with a growing number of schools teaching A+ (CompTIA) certification or Network+, which, if up to date, is a nice "yes, I know a lot of the basics" test.

But for typing classes, etc. (dunno if they do these anymore or just assume everyone HAS to own a computer), it's not like anyone is going to be typing faster than the keyboard scan rate of an Apple II or 8088, so there I don't know if there's a major point to using high-end computers in school. The only advantage (questionable vs. cost) is letting the students learn current software. Although I learned mostly DOS applications (Works 4.x, WP 5.1, etc.) over a networked IBM PS/2 with no hard drive. That was when Windows 95/98 was already out.
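The arithmetic behind that point is easy to check. A back-of-the-envelope sketch in Python (the 120 wpm figure and the 5-characters-per-word convention are my assumptions, not the poster's numbers):

```python
# How many keystrokes per second does a very fast typist actually produce?
words_per_minute = 120   # assumed: an exceptionally fast typist
chars_per_word = 5       # the standard typing-test convention

keystrokes_per_second = words_per_minute * chars_per_word / 60
print(keystrokes_per_second)   # 10.0
```

Ten keystrokes a second is far below the rate at which even vintage keyboard hardware registers key presses, which is the poster's point: for a typing class, the CPU is never the bottleneck.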
 
Computers are like cars these days, few people have a clue how they work inside because they do not need to.

You do not need to teach kids how to use a computer, just like you don't have to teach them how to use a cell phone or a game console (they are born into that tech).

If it were up to me, kids would not get their hands on a computer in school until the last few years of high school. Everything should be taught using books, paper, and pencil. Computers just hide all the intermediate steps involved between the problem and the answer. You learn more by doing, not by plugging in data and hitting the calculate button.
 
Companies like Dell have a lot to do with the problem. My university has signed an agreement with Dell: each incoming student must buy a laptop, and you get a discount if you buy a Dell laptop through the university. We have dismantled one computer lab already in an effort to save money, since the cost of the computers has been offloaded to the students. Now of course the students are buying the best laptops, and this enables us to use software packages that require the best laptops, therefore making the remaining slow computers on campus all but useless. Dell really knows what they are doing, and it bothers me. Good for Dell, bad for us.
 
I just gave away an 800 MHz computer with OpenOffice, Windows 2000, and a DVD player. I found the thing trashed because it was 'too old', but now it is a perfectly working school computer: fast enough to play Flash games, YouTube videos, and DVDs. It's just that, "OMG! That computer's from '99! What the heck could it possibly do????"
 
Maybe a certain part of the problem is consumer ignorance. I could see parents and the general public, who don't understand anything about computers, making the ignorant assumption that you can't possibly use these old computers in the school. "We have to have the fastest computers and current technology. Otherwise how are our kids going to learn QBasic and Pascal?!"

So it could be an appearance game, coupled with the corporate computer-seller contracts and trade-in upgrades, causing a misleading concept of what a computer can be used for. I mean, really... if I'm programming in asm, guess what doesn't make a difference to me? lol
 
Computers are like cars these days, few people have a clue how they work inside because they do not need to.

You do not need to teach kids how to use a computer, just like you don't have to teach them how to use a cell phone or a game console (they are born into that tech).

Yes, I agree. They are now consumables, and unless people want to specialise in IT (specialisation being a function of colleges and universities, not schools), there is no need to teach children how they work to prepare them for society in general. The software makes it intuitive. They are like a car, fridge, TV, or any other appliance. If children are taught at all, it is WHAT to do with them rather than ABOUT them, if you see what I mean.

Computers have so many layers separating the user from what actually goes on, you don't really have to KNOW about them to use them. This is a big contrast to the early 1980s and before.

Tez
 