
There is a really interesting generation gap issue in the replies to your comment. What I perceive as younger people are horrified at the idea of fifty year old tools while the older folks are thinking (I imagine) “if the tools have lasted that long they must be well-honed and very good”.

Of course, this could simply be the perspective of someone turning 50 this year.



People here also don't understand machine shops.

My dad ran a job shop focused on small jobs and the economics are different.

A lot of his work was keeping other local shops / industrial equipment up and running. So there is a lot of variety of work but very low throughput, and almost by definition you have the capability to fix your own machines.

Programming a CNC machine makes it easy to make a lot of the same part, but if you only need one it may be quicker to just knock it out manually.

A 50 year old mill or lathe is easy to keep up and running, and can be upgraded with a digital readout or even CNC controls if desired. A tool in a shop like this likely won't see the cycles one in constant use on a factory floor sees, but may be worth keeping around since it offers a unique capability... he had a large WWII surplus lathe for jobs that wouldn't fit on the smaller, more modern machines, for example.


> What I perceive as younger people are horrified at the idea of fifty year old tools

My students are shocked (horrified?) to learn that they're basically running 50-yr old Fortran code when they use scipy.minimize to train their fancy little neural nets.
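A quick illustration of the point (a minimal toy sketch, not anyone's actual training code): `scipy.optimize.minimize` with `method="L-BFGS-B"` dispatches into the classic Fortran L-BFGS-B routine under the hood.

```python
# Minimal sketch: a toy objective minimized via scipy's L-BFGS-B,
# whose line search and bound handling run in compiled Fortran.
import numpy as np
from scipy.optimize import minimize

target = np.array([1.0, -2.0, 3.0])

def loss(w):
    # toy "training" objective: squared distance from the target weights
    return np.sum((w - target) ** 2)

result = minimize(loss, x0=np.zeros(3), method="L-BFGS-B")
print(result.x)  # converges to roughly [1, -2, 3]
```

The Python on top is just glue; the actual optimization loop is decades-old compiled code.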


I always chuckle at how Python became the dominant language for AI / ML / data science etc., but wonder why it is that Fortran and Python became the golden combo. It could have been done with any other language. No complaints, I love Python, but it's just amusing to me.


For Fortran, I have the concept of 'immortal code' - code that is generally hard to write, is compatible with everything, and implements an algorithm in a way that's impossible to improve on - or at least doing so would be celebrated as a minor breakthrough.

A lot of numerical optimization code is this - it conforms to a strict 'C' ABI, taking (arrays of) simple floats and ints and outputting the same, so binding it to another higher-level language is trivial and rewriting it makes little sense. If the same algorithm were written in Java, most people would not want to bring in Java as a dependency for their Python/C++/whatever project, but since this is just a tiny C object file, it's happily integrated into everything.
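To illustrate how trivial that binding is, here's a minimal ctypes sketch against the system math library (libm stands in here for any compiled object with a plain-floats-in, plain-floats-out ABI; library lookup is platform-dependent):

```python
# Calling a compiled routine with a simple C ABI from Python:
# declare the argument/return types, then call it like a function.
import ctypes
import ctypes.util

libm = ctypes.CDLL(ctypes.util.find_library("m"))  # system math library
libm.cos.argtypes = [ctypes.c_double]  # the C signature: double cos(double)
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

No wrapper generation, no runtime to embed - which is exactly why simple-ABI numerical code gets bound into every high-level language.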

They also tend to be very tricky to get right. I remember reading a paper where the author was adamant that changing the order of a multiply and an add (a mathematically invariant operation) would cause the algorithm to blow up, because the different scales of the floating point values involved would cause a major loss of precision. I'm sure there are tons of stories like this.

This is the sort of code that took PhDs who studied this exact topic years to get right. Even though the actual code often looks unassuming, I would dread the day I was required to touch it (but I never do - since it always does what it says on the tin).


Swapping addition and multiplication on floating point numbers fails to be invariant.

You are thinking of real numbers, for example constructed as Cauchy sequences of rationals. That is not what a CPU does.
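A two-line demonstration of that non-associativity:

```python
# Regrouping floating point additions with very different magnitudes
# changes the result: (a + b) + c != a + (b + c).
a, b, c = 1e20, -1e20, 0.1

print((a + b) + c)  # 0.1 -- the huge terms cancel first, so 0.1 survives
print(a + (b + c))  # 0.0 -- 0.1 is absorbed into -1e20 and lost entirely
```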


Fortran has been a research language used by people like physicists and mathematicians who know what they're doing and have been developing that tool for decades, and Python is perfect for people who have no idea what they're doing.


It's really just that it's a pretty easy language to learn that finds a good balance between structure and brevity. It has very good support for the data structures you need and libraries for everything. A lot of people love the language and that built up a lot of momentum and eventually people started adding stuff like numpy and scipy and pandas and before long you had this giant scientific computing environment that almost anyone can get into.

I tried out most of the scripting languages out there (Ruby, Perl, Tcl, Groovy, R, and many more) and Python just seemed to click more and it has a whole lot less to worry about upfront than languages like C# and Java. In comparison to languages like C and C++, it's a godsend for those with typical automation needs.

In my eyes it seems like a pretty straightforward development. There have been plenty of other tools that may have made sense throughout history too. Matlab could have done this, but by that time nobody was going to build out massive libraries for something expensive and partly closed off.


Python is one of my all time favorites, it is quirky, but nowhere near as quirky as JavaScript.


NumPy was pretty early, and it hit the sweet spot for science: free, fast, with a quick and dirty scripting language around it.

Then the whole ecosystem that Python has nowadays was built on NumPy, and sending data from one library to another was trivial because of it.
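A minimal sketch of that glue: one in-memory format (the ndarray) that every library in the stack produces and consumes, so data moves between routines with no conversion step.

```python
import numpy as np

# The same buffer flows between unrelated routines unchanged:
data = np.array([1.0, 2.0, 3.0, 4.0])
print(data.mean())           # 2.5
print(np.fft.fft(data)[0])   # the DC term equals the sum of the inputs, 10
```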

That's it.


Yeah Python is older than Linux or HTTP. It was virtually unknown in the 1990s, though.

Funny how it’s getting its time in the sun now.


I found R a better language for Data Science, somehow Python won out.


People don’t generally care about the best language. They care about a passable language with the best library access and great support (either commercially or via the community, but preferably both).

I’m not even fond of Python, but I use it sometimes because things exist for it to use and because there are lots of developers around who can share the work on the code. If I write something in R, APL, Julia, OCaml, Forth, Scheme, or Lisp at work I’m going to be the only maintainer and will probably get a stern lecture about that. If I use Perl, Ruby, Java, or PHP, there are a few more people but it’d better be code only my team has to maintain. Go, Rust, C, C++, TypeScript, maybe JavaScript, and Python are safe in most of the company’s codebases but only C, C++, or Rust for the code that needs the most performance and the most stability of resource use.


> People don’t generally care about the best language. They care about a passable language with the best library access and great support (either commercially or via the community, but preferably both).

Moreover, to the extent that they care about the “best” (or at least “better” within the scope of options with suitable ecosystems) language, “best”/“better” is highly subjective and shaped very much by familiarity.


R is an entire language based around the idea of making code as unreadable as possible by means of bizarrely stacked syntactic sugar. It shouldn't surprise anyone that it didn't win out.


No one uses scipy for anything serious anymore in AI research or on any type of modern models.

You're setting your students up for failure if this is how you are teaching neural networks to them. You should switch to pytorch or at least something like tensorflow or jax or you are actively doing intellectual disservice to them and leading them to be actively noncompetitive both in AI academic paper writing/grad school and in the job market.

Similarly, use of sklearn considered harmful in a world where CuML/CuPY/Nvidia RAPIDS exists.

And also, knowledge of conda/pip/poetry considered harmful in a world where UV exists.

Teach MODERN tools please.


It sounds like a fundamental education failure and/or an extreme failure of tool design and training in workplaces if new employees with relevant degrees don't have the basic knowledge to pick up new tools on the job. A liberal arts education is for teaching basic transferrable skills, and is not a narrowly tailored job training program.


Oh my. Teach something that lasts please.

In 2030, pytorch and uv will be so 2025.


I'm somewhere in the middle, young enough that almost everything I've seen new is disposable crap (including the tools), old enough that I have had an interest in things from before and noticed that they really were built much better, or at least heavier, back then.

I've made the comment on here before that I believe it's short term energy optimisation, in that it used to be seen as reasonable to move much heavier objects around. We've made everything so light we've lost the infrastructure for moving heavy stuff around when we might need to.

Kids today have no concept of how heavy workstations, TVs or monitors used to be, and they think it's exaggeration. Let alone tools, cars, appliances etc.


Yes... try moving that 36 inch Sony Trinitron from the car into the house. It weighs 200+ pounds IIRC; you need at least 2 strong people.


I remember the Sun monitors (21" I think) were about 80 lbs. I've read that part of that was a metal frame to hold all of the wires in front of the screen.

They were fun monitors - we had a lab full of them, they would degauss on startup, and the degausser would induct into the monitor next to it (and a little bit into the monitor after that).


The weight from a CRT is mostly about the amount of glass required to keep the atmosphere out, as it's essentially a vacuum bottle with better marketing. On that 21" display, you've got about 6000 pounds of force trying to push the face inward, not to mention the sides, neck, etc.


Yeah, these monitors were a bit heavier than most consumer 21" monitors at the time. We also got Viewsonic monitors for PCs, which were lighter. At the time, I had assumed the additional weight was from extra shielding, but I later read that some of the weight difference was a metal frame holding the wires. The Trinitron had a bunch of vertical wires instead of a grid of holes on the front - if I remember right, they'd shimmer a little if you smacked the side of the monitor with your hand.

It's possible they also had more glass than typical for a 21" monitor, I don't recall if they were any flatter than the Viewsonics or not.


Right. Bought two out of a university lab while living in a sixth-floor apartment with only stairs. Moving those monitors out was more difficult than getting the couch down...

Don't let me get started about fixed frequency, X11 modeline guessing (wrong of course) and needing a second monitor to even get back to the original config.


BWAAAA

I wish I still had one around, just to degauss it occasionally.


Cars haven't gotten any lighter. Rather the reverse. Battery packs are quite heavy.


Even non-EVs have gotten quite a bit heavier due to the inclusion of more structural safety features and creature comforts.


It's also due to the size of the vehicles that are popular today: SUVs and pickup trucks (used as family vehicles).

However the increase has also been offset with weight savings in other places.

- The use of aluminum in suspension components and body panels

- Long ago the move to unibody over body on frame for small cars

- Smaller engines: a V8 weighs more than an inline 4 cylinder and requires heavier suspension components.

For example, a 1989 Lamborghini Countach 25th Anniversary weighs around 3200 lbs, which is slightly heavier than a 2022 VW Golf GTI (~3150 lbs).

I see comments that blame safety technology (electronic components) for increasing the weight of a car but a blind spot monitoring system probably weighs less than 5lbs. A rear camera is also around that.

Structural safety and airbags do add to a car's weight, but these changes have made cars extremely safe.


Arguably the biggest driver is simply the cost of oil. Crumple zones made of styrofoam and plastic bumpers aren't making things heavy.

e.g. https://www.forbes.com/sites/samabuelsamid/2019/01/03/new-ve...


That article is being disingenuous and wrong. It's comparing the lightest possible Civic configuration with the heaviest possible Accord of a different body type.

The 2000 Accord sedan is 2,712lbs, not 2,987lbs (which would be the wagon).

The 2019 Civic sedan is 2,743–2,923lbs depending on equipment/trim.

So yes, the Civic compared to an older car of similar size did get heavier.

The Miata proves that cars don't have to be heavier, but the Miata also took advantage of much more aluminum compared to the older models. Maybe mainstream cars should also switch to more aluminum to keep weight down, and you're right that the reason they don't is that oil is cheap enough that weight isn't a high enough priority to justify more expensive aluminum over steel.


A 2000 Honda Accord and a 2019 Honda Civic have nearly identical dimensions. Car models generally get bigger with each generation over time.


> That article is being disingenuous and wrong. It's comparing the lightest possible Civic configuration with the heaviest possible Accord of a different body type.

Good to know.

> So yes, the Civic compared to an older car of similar size did get heavier.

If the minimum is 1% heavier and the maximum is 2% lighter then I would not say "did get heavier".


You can only make the argument that the Civic is "2% lighter" when it is being compared to a wagon; an apples-to-oranges comparison that invalidates the whole exercise.

They picked that specific year Accord because it's the same size as a sedan as that specific year Civic sedan, so it makes no sense to then compare the weight to the much larger Accord wagon variant. You might as well compare the sedan to a crossover to argue that the sedan didn't get heavier.

The range is 1% heavier to 7% heavier comparing the sedan to the sedan. Both ends of the range are heavier, so "did get heavier" is an accurate statement.


Okay I misread you then, but you're saying the 2000 Accord sedan only has one weight, while the Civic has a several percent range? If that's right then do we know which Civic trim is equivalent to the Accord?

If we know that trim is worth at least 6%, and we don't know how to align the cars, then the confidence interval around "1% to 7%" extends far enough to overlap some negative percents.


Another thing not mentioned by this poor article (everything Forbes does these days is hot garbage) is that heavier vehicles do damage proportional to the fourth power of their weight - https://en.wikipedia.org/wiki/Fourth_power_law

So the ever increasing weight of cars, trucks, SUVs, and especially semi-trucks is also responsible for our roads being shit, full of potholes, and expensive to fix.


Exactly because of the fourth power law, almost all of the road damage comes from the heaviest vehicles: class 7 and 8 trucks, buses, etc. Even the heaviest passenger vehicles are negligible by comparison. And the weight of semi-trucks hasn't been "ever increasing": the normal maximum weight has been fixed at 80,000 pounds for decades.

In some areas the roads are shit due to weather conditions, mainly frost heaves. This has little to do with vehicle damage.


Surely they do damage proportional to the fourth power of the contact pressure on the tire contact patch, not the fourth power of the overall vehicle weight, right? So adding axles or wider tires etc mitigates this.
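For a sense of scale, a back-of-the-envelope sketch of the fourth-power rule applied per axle (all figures are illustrative assumptions):

```python
# Fourth-power rule of thumb: road damage scales with (axle load)^4.
# Assumed illustrative numbers: a 4,000 lb car on 2 axles vs. a fully
# loaded 80,000 lb semi on 5 axles.
car_axle = 4_000 / 2       # 2,000 lb per axle
semi_axle = 80_000 / 5     # 16,000 lb per axle

relative_damage = (semi_axle / car_axle) ** 4
print(relative_damage)     # 4096.0 -- one semi pass ~ thousands of car passes
```

Which also shows why spreading the same gross weight over more axles dramatically reduces each pass's damage.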


Almost like changing the energy source causes a step change in energy optimization priorities . . .


To be fair, a lot of the tools we use as developers have 30-40 year heritages themselves - the things that most people depend on, running in the background.


Relational databases are from the '70s.


I don't know where I fit on that spectrum. My first thought was that there's probably nobody around anymore to replace these fifty year old tools, and/or they'll price replacements at a level that would wipe out all profits for the next 10 years when one is needed.

Our field also has these IBM AS/400s or older running for 30+ years in a server room at the back of an office floor. They are more feared than revered.


I rather think of the maintenance nightmare. You can't change anything - not because the existing system is good, but because there are no people left who understand the whole thing.

But then I've got a few years to reach 50. Perhaps my views will change.


Every software company I've worked at that is more than 5 years old had major features that nobody understood anymore, even features that were core to the product.


Don't forget the critical software that keeps the company going, that someone dealt with long ago and left to rot when they departed, only for someone else to discover it and have to go on an archeology dig to find info and improve upon it.


The gear cutting machines almost never had mechanical failures. As long as they keep them lubricated, and occasionally muck out the sumps, the machines should still be going in the year 2100.

The other thing about gear cutting is that hobs only cut one size/profile of tooth. Some of the cutting tools I was using dated from before WW1, for odd sizes that didn't get used much.


> for odd sizes that didn't get used much

Well, perhaps the long term solution is to rework it and start to use more common sizes. Standardization is a good thing in the long run.


This happens in tech as well. It's called COBOL and it refuses to go away, despite lots of people's best efforts.


The thing is, those 50 yr old machine tools might be still good, but the more recent CNC machines are much more efficient, and require way less manual dexterity to use (say, compared to a lathe).

This is the whole idea of industrialization - moving away from having skilled artisans, into machines that encode the skill to reproduce the article.

The fact that machines that are 50 yrs old are still in operation is quite a feat, but also an indication that the production methods remained static (of course, if the production machines are good enough already, then investment in new machines doesn't bring in new profits).


As someone who grew up in a machine and wood working shop and now builds, repairs and retrofits machinery, I can say that 50 years old is nothing and absolutely fine.

> The thing is, those 50 yr old machine tools might be still good, but the more recent CNC machines are much more efficient, and require way less manual dexterity to use (say, compared to a lathe).

I assume you are referring to manually operated machinery vs CNC machinery? Otherwise there is little to no efficiency gained from a new CNC machine. I've run both, and for simple jobs that can be done on a manual, the setup of a CNC isn't worth the effort. CNCs really shine at high production volumes and very complex parts.

> The fact that machines that are 50 yrs old are still in operation is quite a feat but also an indication that the production methods remained static

If the requirements haven't changed, e.g. machining flanges that meet ASME B16.5, and the production methods are already optimized, why even bring this up?

> (of course, if the production machines are good enough already, then investment into new machines don't bring in new profits).

Right. If the specs didn't change then why bother investing in pointless upgrades?

The ONLY reasons companies toss out machinery: it's no longer useful to the company, or it's so hopelessly broken that it can't be fixed. And there is very little that can render a machine scrap unless something catastrophic happens. And there is very little preventing old machinery from being retrofitted with new controls.


I think it is not always about an age gap. A friend has a distribution company with many trucks, and many times they need to use a manual machine and soldering to fix truck issues.


> There is a really interesting generation gap issue in the replies to your comment. What I perceive as younger people are horrified at the idea of fifty year old tools while the older folks are thinking (I imagine) “if the tools have lasted that long they must be well-honed and very good”.

It's like when someone wants to choose a brand new web framework that isn't battle tested over one of the most battle tested web frameworks. You can hire way more developers with battle tested tooling, than some bleeding edge thing you don't know if it can even scale.


> You can hire way more developers with battle tested tooling, than some bleeding edge thing you don't know if it can even scale.

There's a crucial difference!

You can hire expensive developers on battle tested tooling... or you can hire a shit ton of juniors that want to work on $BUZZWORD for cheap in exchange for the CV creds of "worked on $BUZZWORD".

Resume Driven Development...


> Resume Driven Development

You made me laugh


This is known as the Lindy Effect.

https://en.wikipedia.org/wiki/Lindy_effect


RollerCoaster Tycoon is good.

The business software I have to work with from the 80s is a straight up nightmare. And I'd say most old software is in this camp.


Well, most new software is a nightmare too. And recently a lot of it has started trying to do the wrong thing by design, which is an extra step into nightmare territory beyond anything from the 80s.


Less interested in America's expensive and slow manufacturing than in Chinese processes. They can retool faster, handle far more volume, and except for specialized industries (medicine, aerospace) their quality is better.


How old are you? Do you not understand business cycles? Life cycles? 'Baggy pants are the future; anyone wearing tired old non-baggy pants, it's over grandpa, for we have, for the first time, discovered baggy pants are cool and modern.'

It's all fun and games when everything (your trains, your new bridges, your manufacturing process) is the brand new cool version. I went through that cycle (starting a family, brand new house, new cars, new boat, high paying tech job working on early 2000s cutting edge tech). Our house was the cool house for a bit. Then it was just another house. Then it became a burden with maintenance expenses on top of the mortgage. I became a grey beard with legacy tech skills.

For China, what happens when things settle more? The market flushes out half the companies making the tools (there are tons of companies during the 'fill out' cycle, but at some point that slows and the industry consolidates), or new product lines replace the old. Now your factory is on borrowed time until the machines break down, and you aren't the new hyped cool kid (I'm talking about you, USA as a country/Ruby on Rails devs/Angular devs). Like with a new home, slowly your mortgage gets supplemented with appliance repair bills and other maintenance; what was cool and new is outdated and replaced with 'better'. China is doing well with robots, but what happens next gen when robot/AI interaction is natively built in? Does China scrap their entire 2025 robot infra and replace it with 2030's? Or does someone else gain the advantage of not having those 'legacy' slower-to-retool, slower-volume Chinese 2025 robots/infra?

It's wild how smart people don't seem to understand basic cycles anymore. China is in a growth cycle, and everyone is going to their 'new home party' and saying man, this is all great (it is awesome. It is amazing how China went from poverty to current success, so many improved lives, I love it), and we are acting like this is the first time anyone ever bought a new home and the home will always be perfect and new and cutting edge.

Let's just celebrate that so much poverty and suffering has been eliminated in China, and wish them well and continued improvements, especially as they transition from the generations that suffered to newer generations without all that trauma. That is also tricky for a society to navigate, as life expectations become wildly different.


i see both sides. while some core tech made decades ago might be tedious to adapt to today's needs, i HATE the fact that modern code is not designed to be sustainable.

<rant> speaking of ai, there was a startup acquired by google in 2017, whose core features remain unchanged. however, their sdk and branding got switched around every 18 months or so. incredibly annoying how you cannot run the same code anymore despite having the libraries cached because of the cloud and its moody deprecation cycles.

recently had a similar situation with azure and their (yet another) copilot rebranding had existing work being phased out by the end of the year, when the actual sdk did not get changed besides the package name! </rant>

having said that i have come to appreciate things that were well-designed and lasted that long. change is not bad, but only when it is not just for the sake of it.


Yeah I have students who ask me "Why are we learning C++? My dad said it's older than him!"


Ask that question in language arts, and you'll get some very weird looks.


That's pretty much what I told the student, along with "everything you're studying in every other class is also older than your dad, and that's why you study it."


I'm a nearly 50 year old tool myself.


My first thought is: "when that wears out or fails what the fuck are you replacing it with".


Something newer. The reason that 50-year-old machine tools are still around isn't that they can't be replaced. It's that there's often no reason to.

To use OP as an example, in a lot of places, you'll find an ancient milling machine or a lathe that's dedicated to running a single job a few times a year. The machine was depreciated decades ago, but it can still do that job and there's no reason to get rid of it.

What modern tools give you is speed and flexibility. Many shops need neither.


What do all these machine shops without any need for modern machinery and processes actually do?

Seriously though, of course you can make a living with old tools - however, even the village metal workshop around here has at least one big-ass laser cutter and a CNC mill next to all their old(er) lathes, mills, brakes, presses and other toys. Many oldschool fabricators I spoke to over the last few years are quite interested in what laser welding brings/will bring to the table. Basically all smaller fabrication companies I've seen (the long tail of the car industry and other bigger industries, mostly) are continually upgrading their infrastructure with all sorts of robots and other automation widgets. And so on.


No one said that they had no need for modern machinery. It's an "if it ain't broke, don't fix it" approach. If you have a manufacturing process that was dialed in perfectly 20 years ago, and your customer(s) is still buying those parts, made on that machine, there is no benefit to moving them to another machine that now has to be set up just right, have the new parts coming off it QC'd to make sure that they are identical to what came off the old one, etc.

It's work that you don't need to do and that you won't get paid for. If the old machine breaks, then maybe it would make sense to move the job to something newer.

I used to work with someone whose entire business was retrofitting old machine tools with modern controllers when the decades-old electronics failed. You'd be amazed how much of this stuff is still out there.


Well, you kind of said that literally. And I did not say that one should needlessly move processes to different infrastructure without a good reason. Anyway, I don't think our opinions are very dissimilar.

btw: I think I have a reasonably solid idea of a range of fabrication environments. The oldest piece of machinery I'm responsible for in my professional life is about 70 years old (its basic design is decades older), and some of my personal stuff (sewing machines, mostly) is more than 100 years old. I'm really not against using what works at all.


> What modern tools give you is speed and flexibility.

Many of the modern tools can also be grafted onto the old tools. Not just CNC conversions but the biggest productivity boost when I worked in a shop was converting everything to a zero point clamping system.


I had not heard of zero point clamping. This video I found is extremely cool, I can see how that would save a lot of time and money in the shop!

https://www.youtube.com/watch?v=rrLri_RdgK8


A shockingly rare question to be asked. As best as I can tell, the biggest threat to civilization isn't AI, it's our culture of "that's not my problem" leaving otherwise patch-able holes in critical systems.


The biggest threat to civilization is, in fact, AI. It's a new problem, and one with an utterly ridiculous lethality at its limit. It makes the atomic bomb look benign.

"That's not my problem" is something humans have been dealing with since before they mastered the art of sharpening a stick.


Nah, it's global warming.

Because fossil fuel is stupid useful and there's no way we're going to stop burning it. And then we get to the climate scenarios that aren't compatible with our current sophisticated civilization, even with the currently accepted climate science (that always seems to underestimate what actually happens).


Global warming just isn't harmful enough to pose a credible extinction risk.

The damage is too limited and happens far too slowly. Even the unlikely upper end projections aren't enough to upend the human civilization - the harms up there at the top end are "worse than WW2", but WW2 sure didn't end humankind. At the same time: the ever-worsening economics of fossil fuel power put a bound on climate change even in a "no climate policy" world, which we are outperforming.

It's like the COVID of global natural disasters. Harmful enough to be worth taking measures against. But just harmless enough that you could do absolutely nothing, and get away with it.

The upper bound on AI risks is: total extinction of humankind.

I fucking wish that climate change was the biggest threat as far as eye can see.


Climate change could lead to a massive world war over arable land and potable water. It could also make wildfires more common and more damaging. It will make cyclonic storms stronger. This may not be extinction level, but could be a major pressure on population numbers.


AI is less of an extinction risk than climate change, especially what passes for AI these days. Climate change is going to displace billions of people. That alone is going to be chaos, but food/water shortages are going to be a problem too.

AI is only a threat if we suddenly reach sci-fi levels where AI concludes that the earth is better off without humanity on it and there's zero indication that we're anywhere near that today.

The good news is that our society will likely collapse or require resources pulled away from power/water hungry AI datacenters well before we see any actual I in AI


I've seen this sentiment shared before and I just don't get it. What is the logical progression of "AI" to "more dangerous than the atomic bomb"?


Humans are dominating the environment by hopelessly outsmarting everything in it. Applied intelligence is extremely powerful.

Humans, however, are not immune to being hopelessly outsmarted themselves.

And what are we doing with AI now? We're trying to build systems that can do what human intelligence does - but cheaper, faster and more scalable. Multiple frontier labs have "AGI" - a complete system that matches or exceeds human performance in any given domain - as an explicitly stated goal. And the capabilities of the frontier systems keep advancing.

If AGI actually lands, it's already going to be a disruption of everything. Already a "humankind may render itself irrelevant" kind of situation. But at the very limit - if ASI follows?

Don't think "a very smart human". Think "Manhattan Project and CIA and Berkshire Hathaway, combined, raised to a level of competence you didn't think possible, and working 50 times faster than human institutions could". If an ASI wants power, it will get power. Whatever an ASI wants to happen will happen.

And if humanity isn't a part of what it wants? 10 digit death toll.


Even if LLMs don't become AGI (and I don't think they will), LLMs are potentially superb disinformation generators able to operate at massive scale. Modern society was already having difficulty holding onto consensus reality. "AI" may be able to break it.

Don't think "smart human". Think about a few trillion scam artists who cannot be distinguished from a real person except by face to face conversation.

Your every avenue of modern communication and information being inundated by note-perfect 419 scams, forever.


I think we'll adapt. At any point we can start to treat the internet like the trash pile of bullshit it's been turning into and stop taking anything in our inboxes as legitimate and websites as nothing more than entertainment.


This hierarchy-of-intelligence concept is meaningless.


You might as well have said "intelligence is meaningless". Obviously false.


No, I might not as well have said that. Let me ask you this: huh?


The logical progression to me is AI acting in its own interests, and outcompeting humans much like humans outcompeted every other animal on the planet.

This is particularly threatening because AI is much less constrained on size, energy and training bandwidth than a human; should it overtake us in cognitive capabilities within the next century, I don't see a feasible way for us to keep up.

You might argue that AI has no good way to act on the physical world right now, or that the current state of the art is pathetic compared to humans, but a lot of progress can happen in a decade or two, and the writing is on the wall.

Human cognitive capability was basically brute-forced by evolution; I think it is almost naive to assume that our evolved capabilities will be able to keep up with purpose-build hardware over the long run (personally, I'd expect better-than-human AGI before 2050 with pretty high confidence).


> "That's not my problem" is something humans have been dealing with since before they mastered the art of sharpening a stick.

Yes, and that's why civilizations have kept rising and falling throughout history.


You machine a new part.


Exactly: the new process becomes the lowest bid, new NRE goes forward, and the old process finally gets to drift off into the sunset. It happens all the time and people (sometimes the same people) complain about that too.


Sourcing via eBay.com


As an older folk, I perceive 50 year old tools as likely worn out and in desperate need of replacement. However, often nobody makes the tool anymore, and so we are willing to spend a lot to maintain them instead. (I drive a 25 year old car - this is only possible because I can get a rebuilt transmission, but my maintenance costs over the last 10 years would have bought a much newer/nicer used car, and I'm getting close to where I could buy a new one.)

The other possibility is the tool isn't used much, and modern accountants would never allow you to buy it in the first place because of all the cash tied up (that is, the work the tool did over those 50 years wasn't enough to pay for the cost of the tool and the space to store it).


I was recently talking to the head clockmaker at the Chelsea Clock Company, one of the very few, if not the only, remaining original New England clock companies still operating. He showed me some pictures of clock making tools being used during WWII, and then the very same tools in perfect shape still being used today. He also had one tool that dated back to when they were the Boston Clock Company (circa 1894) that was still in active use. In this new world of disposable tools, it was pretty neat to see.



