What was the most technically advanced decade?

By hamlet

Scribe (2868)

01-01-2020, 22:45

What do you think?
What was the most technically advanced decade in computer history?
Nowadays MHz don't matter like they did in the 90's, and memory doesn't grow in size like it did in the 80's.
I'm sure we are just at the beginning, but when did this journey begin, and in which decade was there the most progress so far?

By Meits

Scribe (5734)

01-01-2020, 23:38

Didn't it begin with Turing's Enigma-decoding machine in the early 40's?
In the 80's, when home computers became a thing, the hardware wasn't that much different between model A and model B. A lot of them ran a Z80 and no machine was absurdly stronger than the others.
My guess is that during the 90's things took off in a bizarre fashion. I remember 286 machines at friends' homes in the early 90's and a Pentium III on my AOpen AX6BC in 1999. Imho that leap was bigger than the one between Turing and the first home computers (but I might be wrong Wink I didn't open Wikipedia on anything for now Big smile ).
The last decade didn't do a lot on performance, but more on energy efficiency.

By mcolom

Resident (57)

01-01-2020, 23:47

I think it was the 70's, with the start of microelectronics, which made it possible to build the machines that Ada Lovelace or Ramon Llull imagined and designed. And to have them at home to play games with the MSX! Big smile
The second one is the very late 80's, when back-propagation applied to deep networks (LeCun et al.) combined with big data started to give something that resembles artificial intelligence.
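The back-propagation mcolom mentions can be shown in a tiny sketch (my own toy example, not anything from LeCun's papers): a 2-2-1 sigmoid network trained on XOR, with the weight updates derived by hand via the chain rule.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(epochs=2000, lr=0.5, seed=1):
    """Train a tiny 2-2-1 network on XOR with plain back-propagation.
    Returns (initial_loss, final_loss) so progress can be checked."""
    rng = random.Random(seed)
    # weights: input->hidden (2x2 plus 2 biases), hidden->output (2 plus 1 bias)
    w_ih = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    b_h = [rng.uniform(-1, 1) for _ in range(2)]
    w_ho = [rng.uniform(-1, 1) for _ in range(2)]
    b_o = rng.uniform(-1, 1)
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    def epoch(update):
        nonlocal b_o
        total = 0.0
        for x, t in data:
            # forward pass
            h = [sigmoid(w_ih[j][0] * x[0] + w_ih[j][1] * x[1] + b_h[j])
                 for j in range(2)]
            y = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + b_o)
            total += 0.5 * (y - t) ** 2
            if update:
                # backward pass: chain rule through the sigmoid units
                d_y = (y - t) * y * (1 - y)
                d_h = [d_y * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
                for j in range(2):
                    w_ho[j] -= lr * d_y * h[j]
                    w_ih[j][0] -= lr * d_h[j] * x[0]
                    w_ih[j][1] -= lr * d_h[j] * x[1]
                    b_h[j] -= lr * d_h[j]
                b_o -= lr * d_y
        return total

    initial = epoch(update=False)
    for _ in range(epochs):
        final = epoch(update=True)
    return initial, final
```

The 1980s insight was exactly this: the error gradient can be pushed backwards through any number of layers, so networks deeper than a single perceptron become trainable.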

By hamlet

Scribe (2868)

02-01-2020, 10:25

It seems every decade had its glory:
40s: faster than human brains
50s: A/D - from analog to digital
60s: far away - gone to the Moon
70s: bring it home - home computers
80s: KB - memory wars
90s: MHz - from Mega to Giga
00s: portability - laptops
10s: efficiency - iPad
20s:

By nikodr

Paladin (728)

02-01-2020, 12:30

I think we still live in the age of transistors, so for me the most advanced decade was the one in which the first basic transistor appeared.
Soon after, everything changed: almost everything now has to do with transistors.

By ducasp

Master (187)

02-01-2020, 12:56

My thought on this is that we are still seeing quite quick progress, but it is now ramping up through multiple cores (be it GPU, CPU or memory channels), parallelism like our brains use... Still, most people have a hard time properly understanding multithreading, and there is even less incentive to squeeze every drop of performance out of current chips. We are in the "good enough for my needs" era, and most new programmers will cry a river facing a limited RAM/CPU scenario.
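The multi-core point above can be illustrated with a minimal sketch (my own, not from the post): split a CPU-bound job into chunks and hand each chunk to a worker, then combine the partial results.

```python
import concurrent.futures
import math

def count_primes(lo, hi):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, math.isqrt(n) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

def count_primes_parallel(limit, workers=4):
    """Split [0, limit) into equal chunks, count each chunk on its own
    worker, and sum the partial counts. ThreadPoolExecutor is used so
    the sketch runs anywhere; in CPython, swap in ProcessPoolExecutor
    (same API) to sidestep the GIL and actually occupy multiple cores."""
    step = limit // workers
    bounds = [(i * step, limit if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(lambda b: count_primes(*b), bounds))
```

Even this toy shows why multithreading is hard to get right: the split only helps if the chunks are independent, and load-balancing them evenly is already a design decision.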

By Meits

Scribe (5734)

02-01-2020, 13:32

ducasp wrote:

most new programmers will cry a river facing a limited ram/cpu scenario.

They should search for another job/hobby then. I'm amazed that small tools that would take some kilobytes on an 80's computer need several megabytes these days. Total lack of skills (or of time given by the boss).

By PingPong

Prophet (3499)

02-01-2020, 14:41

Fully agree. Today even a newbie can call himself a software developer.
But a software developer is something different from a person who blindly executes bash commands and copy-pastes source code from the internet.

By konamiman

Paragon (1051)

02-01-2020, 14:59

hamlet wrote:

90s: MHz -from Mega to Giga
00s: portability -laptop
10s: efficiency -ipad

Another interesting perspective is to look not at what was invented in each decade, but what was popularized:

90s: personal computers become commodities
00s: the explosion of the Internet
10s: the decade of the smartphone

Under this view I think that the IoT world has a lot of interesting things to offer in the 20s.

By Palver

Rookie (30)

10-01-2020, 16:02

To me, the biggest technological challenge was the arrival of man on the Moon. Since 1972 nobody has been back, and what is worse, at the moment nobody is able to do it again. If you look at the payload capacity of the biggest launchers you will see the progression:

1- Saturn V: 60's, 140 tons
2- Energia: 80's, 100 tons
3- Falcon Heavy: present time, 64 tons

So, the 60's ruled. It looks like we are going backwards...

By wbahnassi

Resident (48)

10-01-2020, 18:40

Meits wrote:

I'm amazed that small tools that would take some Kilobytes on an 80s computer do need several Megabytes these days. Total lack of skills (or time given by the boss).

Hehe. And you're not even counting the DLLs that carry even more functions you use (maybe unknowingly) in modern programs. So if you trace those too, we're talking much more.

I'd say skill is part of it, but it's the target platform and toolchain that bite us nowadays. For example, getting rid of the MSVC runtime libraries is a tough battle that eventually shrinks your EXE size and DLL dependencies to below 50 KB. But of course you lose the CRT and the C++ STL (maybe for the better Tongue )

What I really blame, though, is the advancement of hacking techniques. Recent compilers fill your EXEs with a load of security mechanisms to prevent things like buffer overruns, data execution, etc. It's a never-ending battle to keep developing countermeasures, and those countermeasures take a toll on your programs: they inflate them and slow them down. So yeah, computers are faster, but they're spending a lot of their power on useless crap because 1% of the population is looking to hack into your computer. That's why systems like MSX are lovely. Innocent, simple and clean Smile

Anyone heard of a virus on MSX?