Samsung to jump into laptop processor market with Exynos chip in H2 (kedglobal.com)
309 points by cbg0 on May 10, 2021 | 294 comments


I doubt Samsung is gonna come even close to Apple in performance. But perhaps getting decent ARM processors in laptops will mean we get battery life that is actually good.

It's hard to believe that you can use a mobile phone with a high resolution screen and get 8 hours SOT with a varied workload. But browse the web in Chrome on a 2020 Intel or AMD based Windows laptop with a 1080p screen and see that your battery life doesn't even come close. Despite a much larger battery.


I'm confused. I easily get 8-10 hours on my Core i7 laptop as long as I'm not at 100% CPU usage permanently. Modern Ryzens also hit this easily. So I don't know why 8 hours screen on time on a tiny screen with an incomparably weak CPU is supposed to be proof of how much we need ARM.

Apple's M1 is in a class of its own, unfortunately.


I’m also frequently confused when people refer to Windows laptops as in a different league of battery life. Modern Windows laptops do just fine. M1 is a cycle ahead, but the Ryzen mobile parts are actually very good too.


Lots of people buy 15 inchers with dedicated graphics etc. Or like... run Linux (which hoses the battery on a lot of stuff if you just use it out of the box without tweaking stuff).

Also really cheap laptops aren't as great about battery life just cuz of reasons. I think there's a lot of selection bias (like people who have a lot of money will buy macbooks in general cuz that's what everyone says to do)


I've used various distros on my recentish Dell laptop. I get the advertised 8-10hrs without configuring anything.

It just werks.


I configured my Thinkpad X280 under Arch to run at roughly 3 Joules/second, which gives me well over 12 hours of active use time.

Linux is far superior to windows in this regard.
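For anyone curious what that kind of tuning usually involves: it mostly comes down to installing TLP and checking where the watts go with powertop. This is a rough sketch for Arch; the option names are from TLP's documentation, but the exact settings that are safe (and the savings you get) vary a lot by model.

```shell
# Install and enable TLP (Arch package names assumed):
sudo pacman -S tlp powertop
sudo systemctl enable --now tlp.service

# See where the power goes (battery discharge rate, per-device tunables):
sudo powertop

# Typical /etc/tlp.conf overrides for on-battery operation:
#   CPU_ENERGY_PERF_POLICY_ON_BAT=power
#   PCIE_ASPM_ON_BAT=powersupersave
#   USB_AUTOSUSPEND=1
```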


Unless, unfortunately, you've got switchable nVidia graphics. As of last year, at least, without some real gross stuff you're looking either at awful multi-monitor performance on AC power (because the dGPU is permanently switched off) or awful battery life (because you're using the nVidia chipset and it eats batteries for breakfast, lunch, and dinner).


Nvidia's Linux support story has been crummy for the last half decade; if you bought a laptop with an Nvidia dGPU and are expecting it to work efficiently in Linux, then you need an expectation reset.


Yeah, for graphics it sucks. OTOH, for CUDA it's literally just plug-and-play (at least on Ubuntu 20.04).

Makes a nice change from five years ago, when I kept breaking my display in order to make CUDA work.



> Joules/second

aka Watts


To be fair the laptop is rated for 16.5 hrs.
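The rated figure is actually consistent with the ~3 W number above, if you assume the X280's roughly 48 Wh battery (my assumption, not from the spec sheet; you can read the real capacity from /sys/class/power_supply/):

```python
# Back-of-envelope: runtime = battery capacity / average draw.
def runtime_hours(capacity_wh: float, draw_w: float) -> float:
    return capacity_wh / draw_w

print(runtime_hours(48, 3))     # 16.0 h at a steady 3 W draw
print(48 / 16.5)                # ~2.9 W average implied by the 16.5 h rating
```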


Yes, in some weird lab setting with 12% screen brightness and no user activity.


A friend got an X250 with no tuning to work continuously for something like 26 hours.

But ThinkPads back then weren't slimmed down at the expense of battery size.


how's the keyboard compared to 2015 macbooks?


It's very much OK. It's different but it's still good. I prefer the MBP (provided 2015 is still the old non-butterfly mechanism) but I would have no problem with switching 100% to the TP keyboard.


Why isn't Linux configured to better preserve battery life on laptops by default? Power usage has been an issue for over a decade. If it's just a matter of a few tweaks, surely it could be done?


It's not a matter of just a few tweaks. It's a few new tweaks each generation, as Microsoft and Intel keep changing their minds on what the "right" way is to put devices into low-power modes, and then not thoroughly documenting those changes, let alone upstreaming them for inclusion in standards documents.

And then there are the hardware bugs which require workarounds in firmware or drivers, and the firmware bugs which require workarounds in drivers, all of which are only developed and tested against Windows.

The set of power management options that Windows 10 exposes to the user has been steadily dwindling to the point that on a new laptop you basically only get to customize how long before the screen shuts off and how long before it goes to sleep. All the more detailed options you had in early Windows 10 or in Windows 7 are no longer exposed, because there's no consistent way to map such controls onto the ever-shifting set of underlying platform features.


Distros don't even automatically install and configure TLP when they detect an internal battery. So yeah, it is 'Linux' that is the problem on laptops.

That and automatically activating a 'small speaker' EQ for when a device is detected to be portable / be a laptop and have internal speakers would be a massive user experience improvement for laptop Linux users.


> Distros don't even automatically install and configure TLP when they detect an internal battery.

TLP isn't magic. More than half of its documented options are inapplicable to current hardware or kernels, and a fair number of the remaining options that could still have an effect are not safe for distros to ship as defaults, usually because they'll trigger firmware or hardware bugs. Installing TLP by default would be far from an actual solution, and isn't even that great a first step toward solving platform power management inadequacies.


A few tweaks for each different model of laptop. Consumes a lot of volunteer effort very quickly, that.


And some of the tweaks can cause instability or data loss on different models with different quirks.


Mostly speculation, but I think a big part of it is that MacOS will heavily throttle as it needs to, and I imagine that Windows laptop drivers get a lot of love in that space too since the manufacturers want to get good battery life in their machines as well.

I'm sure a huge component of it is just people-hours spent in figuring out the right balance of defaults that don't just mess people's setups up.


It's up to each distribution to provide a kernel build + configuration that suits some specific use-case.


Pop OS has its own power management that is pretty decent. I can almost get through a whole day of work without plugging in.


I suspect it has to do with getting decked-out machines. If you've got the >1TB SSD and the >=32GB RAM model, your OS will simply use the resources for caching and speeding up the machine, which in turn will use more battery.

At least that's my pet theory, running a 4750U with 64GB of RAM installed ;)


The RAM probably uses more power just by having more installed, but it shouldn't be causing more expensive work. If anything, being able to use RAM cached content should allow the SSD and its IO channels to idle more often. A "decked out" laptop might also have a more powerful GPU with nowhere near as good idle power characteristics.

But, the largest differences are probably in idle power management, with the phone having much lower power IO channels and better zoned power management to really reduce idle power. A laptop often has the system in a much higher power level with off-chip resources powered up, including WiFi, screen refresh, data buses, and backlight driving a much larger screen area.

Edit to add: I've noticed on laptops with Linux that vastly different battery life from "similar" machines come from differences in how the software and firmware interact to reach different powersaving modes for screen on but idle states, such as with a browser open on a page that has already rendered and awaits user input.


That's not quite accurate. When you're talking about standby time in suspend-to-RAM, RAM refresh is actually a non-trivial battery cost because it's basically your only remaining active power draw. Which DIMMs have active memory pages is partly a function of memory usage.

So if your kernel woke up (let's say after 30 min of suspend-to-RAM) to try and see if it can reclaim physical DIMMs and mark them unused (shrink caches, reorganize pages, etc.), you could get quite a big win on standby time. The trick is doing this without actually hurting battery life (waking up can be expensive and there's no guarantee you'll be able to free up DIMMs) and without degrading wake-from-suspend performance (if your phone is laggy on wake, that's an experience users will switch away from + you could eat your entire power savings paging back in).


Is the memory controller aware of unused pages in physical RAM that thus can be safely skipped during refresh? That would surprise me. In stand-by the self-refresh mode is used, and I believe partial refresh is only supported by the low-power (LPDDRx) standards and is very coarse-grained.


No. What I heard proposed was just at the DIMM level. The memory controller has no concept of memory usage.


The refresh costs you are talking about are what I meant by the RAM has cost by being installed on a typical laptop. The power is used because the machine is in an active-idle power state with that much RAM installed, not because the OS has decided to burn more CPU time in the presence of more RAM as someone suggested earlier in the chain. Most of the idle power consumption is all the ancillary system controllers, not the CPU cores themselves. These are the things that are better managed on a typical smartphone SoC platform.

And, I am referring to active idle, not suspend-to-ram scenarios. I think people are talking about screen-on time for battery life, not lid closed and suspended nor lid open but screen disabled. Precious few people are enjoying a laptop that successfully performs a suspend to RAM sleep state while keeping the display alive. Often, the integrated GPU is continuously scanning framebuffers in system RAM and outputting pixel data to the embedded display port even though nothing is changing on screen. Frequently, drivers have disabled LCD panel self refresh modes due to unresolved glitches and artifacts in the graphics stack.


Yup. One of the major reasons cell phones outperform laptops on battery life is because they go to suspend to RAM aggressively. I think a similarly aggressive mode would be needed for laptops on battery power to compete effectively (& might require OS optimizations to make the wake scenario to be as instant as it is on mobile).


It's a reality distortion field, and maybe an effect of not blocking OSX's default "go to sleep the moment you stop looking at it" policy. Macs were stuck at (marketing) 10h battery life for so long (for comparison, Xeon mobile workstations with several times the power draw regularly hit 10h on battery without tuning) that the M1 feels like a huge jump out of nowhere.


Yeah. Consumer CPU power draw is largely a solved problem. Total aggregate processor energy use on a typical Intel laptop is already down to somewhere around 20-30% of what the display backlight pulls. That's good enough, you're just not going to do much better on a laptop form factor.

The M1 isn't really that notable for "battery life". All the people raving are hitting particular edge cases of high CPU utilization that consumers (even developers) generally don't see when browsing and watching. The Apple power magic is all happening in phones.

And the magic of the M1 is that they have achieved desktop-class (nearly market-leading) performance in a chip that still draws like a phone at idle. It's an amazing piece of engineering, but in a laptop it's really just an incremental improvement over what we already have.


> Apple's M1 is in a class of its own, unfortunately.

We want companies to excel and give a stiff competition to their peers. Now that we have M1, it's pushing the entire floor to the next level - pushing Intel, AMD and the entire x86 ecosystem.

Curious, why do you think it's unfortunate? Is it because Apple is a large company?


Not the parent, but: it isn't unfortunate that the M1 is so good; it's unfortunate that everything else is so far behind.


Is it that far behind? Benchmarks aren't everything, but they're probably as good a metric as any. The A14 is ahead of the SD888 in Geekbench (which has historically favored Apple CPUs), and behind on Antutu (which is more considerate to Snapdragons).

Geekbench 5:

- Single core: A14 1609, SD888 1127

- Multi core: A14 3872, SD888 3731

Antutu:

- A14 615796, SD888 723674


Benchmarks aren't everything, as you said. I had the last-gen Intel MacBook Pro for 3 years and replaced it with the M1 MacBook Pro. It's almost an apples-to-apples comparison, as they have the same specs except the new processor. The difference is night and day. It simply feels faster to use. It lasts almost 1.5 workdays of my use, while the Intel one would not even last one full workday. I don't need to lug around a power brick anymore. I charge at home and bring only the Mac to work without feeling anxious about battery. The cost I pay is that building x86 Docker images is quite slow on M1, but I don't do it often enough for it to bother me much.


> and behind on Antutu (which is more considerate to Snapdragons).

It is more considerate to multi-core workloads.

The main point is the single-core benchmarks, where the A14 sets itself apart from all of its competitors.


Note that M1 is faster than A14:

Geekbench 5:

- Single core: 1744

- Multi core: 7676


The quoted comment referred only to battery life.


If Samsung just commits to outspending Apple on node commitments, transistor counts, and die size, then the outcome of the race may not be so predetermined yet.

Transistor for transistor, ARM's latest licensable cores are not that far behind.


Last time I checked Samsung tablets and phones, they already lagged right there in the store. I don't know whether it's all the crapware that gets pre-installed, or just inefficient programming. But I think CPU speed might not help them.


Their Exynos are infamous for being slower, running hotter and using more power than equivalent Snapdragon and even Kirin (and of course, Apple SoCs).

But I doubt that shows in light usage like scrolling or opening an app, that's more likely their (lack of) software optimization.


My experience is that Exynos hw was great... but Samsung software (including drivers) was problematic.

Combine that with the legacy of Android's design going for flexibility, including allowance for inefficient options (especially important when comparing graphics - several Apple models ran on the edge of being able to paint one frame without stutter assuming well-optimized code, and Apple spent a lot of time ensuring you didn't see that), and you get a certain reputation.

The only thing that I noticed really problematic is graphics intensive software optimized for Qualcomm.

Also, the real issue is not that Samsung doesn't have the capability. A lot of Apple's "secret sauce" is that they don't have contractually separated design teams that have to "shop" around for suppliers/buyers, which means that both Qualcomm and Samsung are forced to make more mediocre CPUs because it brings a wider selection of buyers - whereas Apple can design the device and SoC together, which lets them easily take decisions like "ok, let's put A LOT MORE L1/L2 cache on each core" because they aren't going to deal with customers not wanting to buy them.


But Samsung has its own huge mobile business. Wouldn’t it be justified to make an OP SoC for their Galaxy Tab(or whatever they sell)? I’m sure their execs are aware of the possibility.


Supposedly they are forced to keep a "Chinese wall" strategy due to possible legal concerns. How true that is is hard to check, but there seems to be considerable variation and definitely less "design this CPU specifically for this phone" in the Exynos lineup, except maybe for the first Samsung Galaxy S and SGS2.

P.S. Exynos and Apple A-series have common ancestry


Outspending Apple seems like a tall order, to put it mildly.


probably not when you're the second largest


You would need to not only outspend them, but at least match their engineering. Either one alone would be a tall order. But they should at least try, consumers will benefit even from a close miss.


Not OP, but Apple doesn't sell their chips to other companies. It's their right to do so, but if Apple sold M1, that'd be great for consumers.


M1 is that good because Apple customises their software for it (or the other way around). I doubt any other company would dedicate that much effort to it.


M1 is quite good at general-purpose workloads.


Does the processor really dominate power consumption in modern laptops so much that the M1 chip largely explains the long battery life in new Macbooks? The data doesn't seem easy to come by with a cursory web search.


I was considering Zephyrus G14 at some point, so I visited the dedicated subreddit. There was a guide on getting it to ~8 hrs on battery. Man, I lost my enthusiasm right away. Custom fan management, getting rid of bloatware...


It's very unlikely given the difference in mobile performance, but it might mean that Samsung is going to seriously invest in ARM performance going forward.

The latest Samsung mobile processor (Exynos 2100) hits 956 single-core and 3,151 multi-core for Geekbench. That's basically the same as the Qualcomm Snapdragon processors or maybe a bit slower. Apple's A14 hits 1,585 single-core and 4,214 multi-core (66% and 34% faster respectively).

I would note that the Exynos still has good numbers. A 2020 MacBook Pro with i5-1038NG7 pulling 28W (not the cheaper 15W MacBook Pros) only hits 1,143 single-core and 4,227 multi-core. While the Exynos doesn't quite match an $1,800 laptop pulling 28W on single-core performance, the Intel is only 20% faster. Plus, on multi-core performance it is matching. If Samsung created a 10W or 15W part, it seems likely that they could beat that Intel processor.
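The "X% faster" figures above come straight from the ratios of the quoted Geekbench scores:

```python
# Reproducing the percentage gaps from the Geekbench 5 numbers quoted above.
def pct_faster(a: float, b: float) -> float:
    """How much faster score `a` is than score `b`, in percent."""
    return (a / b - 1) * 100

print(round(pct_faster(1585, 956)))   # A14 vs Exynos 2100, single-core: 66
print(round(pct_faster(4214, 3151)))  # A14 vs Exynos 2100, multi-core: 34
print(round(pct_faster(1143, 956)))   # i5-1038NG7 vs Exynos 2100, single-core: 20
```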

And one can nitpick about Intel having released an 11th gen processor or looking at AMD performance or any number of things. However, it seems reasonable to believe that Samsung is definitely within striking distance of Intel's laptop performance without too much difficulty and certainly within what people want in terms of laptop performance.

If there were no worries about emulation of x86 instructions (since a lot of Windows software may never be ported to ARM), I think Samsung could easily satisfy people's performance needs.

Ultimately, Apple isn't really a competitor if they're not going to sell their M-series chips to Windows laptop makers. Maybe people will start buying Macs based on their performance, but generally people have bought Macs because they want the Apple hardware/software, not because they could get better performance (and even stuck with Apple when they would have to suffer through miserable performance circa 2004).

Samsung doesn't need to match Apple. They need to match Intel and it looks like they're there. If they can offer something at a lower cost with better thermal properties, that's a huge win. I think the bigger issue is going to be that so much Windows software won't get ported over. On the Mac side, everyone knows that they have to port everything over. On the Windows side, I'm guessing there are a lot of developers that doubt everyone's commitment to Windows-ARM. Plus, part of the Windows ecosystem is being able to run all sorts of stuff - something Mac users don't expect.


M1 and Rosetta show that it is possible to emulate x86 and still run it faster than Intel for some workloads. Nothing prevents Samsung and Microsoft from doing that with Windows on ARM. And they do not need to match Apple's efficiency; something like a 50% CPU slowdown should be OK for most business- and education-oriented software, as long as native applications roughly match the latest Intel CPUs.


I'm guessing a lot of that has to do with software that gets preinstalled on Windows machines. There was an article the other day about how bad it is to release a laptop in 2021 with only 8GB of RAM. I noticed people in that thread talking about how their laptops are using 6-7GB of RAM right after startup, which means they must have a lot of processes going on in the background. I'd imagine cutting down all of the unnecessary background processes would really help with battery life on laptops.


8GB is fine, the problem is that laptops are still shipped with 5400RPM hard drives. My mother bought a new $700 Inspiron from Dell; it's made of janky plastic and took 5 minutes to get to the desktop, and longer to do anything else. Replaced the 5400RPM drive with an SSD and it was plenty fast.

OEMs are sabotaging themselves to get their customers to buy their $1200 laptops - at which point people are instead buying MacBooks.


As long as you're OK with macOS, there simply is no competitor to the MacBook Air for the needs of 90% of people, and there hasn't been for nearly a decade in my opinion. It's cheap, lightweight, fast, has long battery life, and has had a tried-and-true design for many years.


> for the needs for 90% of people

You could say the same thing about a Chromebook. And they are half the price of a Macbook Air.


Google’s propensity to kill and change things every year is reason enough to not deal with their products. I certainly would not recommend it to someone who just wants it to work and get out of the way, and does not want to keep up with news about changes Google is making.


That's true for many Google products - I've been burned myself by the Daydream abandonment - but Chromebooks are widely used, provided by many companies, some have an 8-year support policy, and Google bought a company whose business is to support ChromeOS on legacy devices. So it's much bigger than one of their trial-balloon products. I've been recommending them for years to different people who don't need a "full OS," and they've all been very happy with them. It's actually much less than half the price of a MacBook for a decent IPS screen and even a convertible/hybrid Chromebook like the Lenovo Duet. I find a lot of pushback from Windows enthusiasts; Apple people are more likely to suggest an iPad, which I think is more reasonable. But for many activities a Chromebook is a great choice, and you can buy one and a handset for the same price as an Air.


Chrome OS has been around for a good while.


Older people need 17" screens.


Older people may not need laptops. Especially if they do not work. My grandparents just have a desktop.


I am in my 40s. I have spent 99.9% of my personal computing life not using a laptop.

I have zero interest in getting one for the foreseeable future.

Work assigns me one. It sits on my desk at work in the dock. It could be a desktop but company policy is "everyone gets a laptop".

shrug Each to their own.


I have 32GB of RAM. 99% of the time I am under 8GB. While playing RAM-intensive games like Minecraft, 16GB is OK most of the time. However, I bought 32GB simply because that is how much I could get for 100€.


Android phones have a lot of crap preinstalled, too. Android tends to be much more ruthless about automatically terminating unnecessary processes, though.


Graphics also matter. My 6-year-old Inspiron let me develop with 8GB of RAM. Last year I bought a new one with a Ryzen 5, recreated the very same setup and even used the same SSD, but it began to freeze or crash...

The APU was reserving 3GB of RAM for the GPU. The next day I went for 8GB more.


Agreed, the last 4 laptops bought by my family, the ones with dedicated graphics are still going strong, but the ones with integrated intel graphics aged poorly despite having better CPU perf.


I don't think you understood what your parent meant. In their case the GPU was detrimental because it reduced the RAM available to the rest of the system to 5GB.


APUs are integrated graphics. Only integrated graphics share system RAM. Dedicated GPUs get their own RAM.


This was the problem that surface laptops set out to solve by not preinstalling the crap that Microsoft hates (but can’t stop OEMs shipping). They were clearly at least partly successful but I guess they can’t get into the low end well enough.


In theory though, if that stuff is mostly dormant, it should eventually just get paged out and consume neither significant RAM nor CPU time.

Of course, that depends on that software not being horribly miswritten to wake up all the time.


Supporting that theory, I find that Chromebooks can get by with less RAM than their Windows equivalent (even when the Windows machine will only be used for the web, which is pretty common these days in my experience).


Samsung's custom cores haven't been competitive with Qualcomm so far, much less Apple.

You only have to look at comparisons between the Surface Pro X and the M1 Macbook Air to see how far behind Qualcomm is.

https://www.youtube.com/watch?v=OhESSZIXvCA


Even more damning, the future Surface Pro X, with an updated Qualcomm chip, will still be slower than Windows in a VM on an M1 Mac.

https://www.windowscentral.com/leaked-snapdragon-processor-c...


They're not using their custom cores anymore. See the ARM X1/A78/A55 arrangement used in the Exynos 2100. They licensed AMD graphics IP to use that as a differentiating feature.


They still use substantially less cache than Qualcomm.


Maybe so, but cache is only one part of the overall story. If you look at benchmarks run on phones with both types of processor, Qualcomm is ahead, but the 2100 is nipping at their heels.


The latest Exynos chips are pretty competitive with Qualcomm. Samsung has closed the gap considerably in this generation and even outperforms Qualcomm in some respects.

https://www.gsmarena.com/exynos_2100_vs_snapdragon_888_bench...

In the long run having their own fabs is going to give them the upper hand in this battle I suspect.


Differences between them start at 4:20 https://youtu.be/OhESSZIXvCA?t=261


They're comparing a dev-channel Windows feature to a stable MacOS feature. And they compare an emulated Chrome benchmark against a native version of Chrome. The Windows feature doesn't even run some apps, which is a clear sign that it's not optimised.

It's obvious that M1 is a lot faster, but all the theatrics and data in this video doesn't make up for the fact that this is not a good comparison.


Newest-gen Ryzen laptops are different. They're significantly more energy efficient than older AMD parts and any Intels. The energy efficiency of new AMD processors is something like 50% higher than Intel's.

A couple of other things work against laptops. The screens are big and usually not OLED. They use forced cooling. And on cheap laptops the batteries are really tiny - as little as 40 watt-hours, around 2.5x the size of a phone battery.

Energy efficiency of new Ryzens is probably better than Android ARM CPUs, similar to the iPhone or M1.


Is energy efficiency actually similar to M1? I'm not aware of any Ryzen-based fanless laptops


Lenovo IdeaPad 1 is the one that I could find, but at the $350 price point needless to say it is not a flagship product. Main issue right now is that the 6W SoCs are about a generation and half behind the regular notebook APUs (which are themselves 0.5 to 1 generation behind the latest Zen in desktop CPUs). Not getting the latest and greatest means that it's highly unlikely anyone would want to bother designing a top-of-the-line product around it.


I'm fairly sure it is. Apple does have more experience with fanless SoCs though


Is this a problem with processors, or with Chrome?

Even on Intel MacBooks, battery life while using Safari is far better than while using Chrome.


Chromebooks do pretty well, apparently? Maybe it has to do with the amount of integration with the OS.

> The average Chromebook lasts 9 hours and 58 minutes, according to the data we’ve collected on the Chrome OS devices we’ve reviewed thus far.

https://www.laptopmag.com/best-picks/chromebooks-with-the-lo...


I have a Chromebook running Linux that can get 7-8 hours on web browsing; integration might be part of it but I don't think it's a major one.


In some cases it's just the inevitable effect of the OS vendor being the browser vendor.

For example, DRM video in Safari is decrypted in dedicated low power hardware as the DRM system is in the T2 or M1 chip, because Apple built FairPlay into the hardware.

Chrome uses Google's own Widevine, which isn't implemented in hardware, so all the decryption operations are done in software on a Mac (with a subsequent hit on battery life). On a Chromebook, Widevine is done in hardware, so the battery life is better (and content providers trust you with higher resolutions because it's more secure). And if Apple were willing to port Safari to Chromebooks, the roles would reverse and Safari would take the performance penalty.


I think it's probably more Chrome's fault as Safari is known to be more optimised for battery. Likewise old Edge was pretty good for battery too.

I know that if I use the native Windows Store based Netflix app on my laptop (Ryzen 4800u) I get much much longer battery life than if I use Netflix through Chrome. But it's still a long way off what you'd expect from a device with a battery so large compared to a phone. I think the M1 Mac vs Intel Mac battery life comparisons showcase the difference in CPU efficiency the best.


> I know that if I use the native Windows Store based Netflix app on my laptop (Ryzen 4800u) I get much much longer battery life than if I use Netflix through Chrome.

The native app uses hardware decoding, while the browser one falls back to software decoding, so this is not surprising


In my experience, Safari (and old Edge) were more willing to let pages crash or limit CPU cycles for the sake of system performance, while Chrome is content to let pages allocate as much RAM and use up as many CPU cycles as they want, letting poorly behaved websites work while sacrificing the rest of the system and battery life.


It's the processors. The x86 architecture is not as efficient as ARM, which is why ARM is used in embedded low-power devices.

Like the common adoption curve, Intel viewed ARM as a toy that could never compete with their performance. But ARM got better and better every year, and though x86 got marginally better, it wasn't being driven by a market that demanded much higher performance. ARM caught up, and thanks to some exceptionally clever engineering by Apple, has surpassed the capabilities of x86.

We've seen the same story many times before, electric cars will never be competitive, online shopping will never compete with the in store experience, solar/wind/etc will never compete with coal, digital cameras will never compete with film....etc etc

Add your own, there are plenty of examples, but I'm failing to remember others right now.


Apple caught up to and then surpassed x86. ARM's own designs are a joke. Compare the Surface Pro X to an M1 Mac.

If the ARM ISA was inherently so much better it would be easy for Qualcomm and Samsung to release top performance ARM laptops, but they can't.

Apple is so far ahead of everyone right now.


Apple's M1 is an ARM processor that has two major benefits over ARM based PCs.

1) Rosetta is FAR more powerful than any translation layer on Windows. This means the M1 is able to run Mac programs that were built for x86 much more efficiently than Windows on ARM runs x86 Windows apps. It's amazing how much this piece of software does in making this a smooth transition for Apple.

2) Apple's end-to-end pipeline has let them define the first M1 chip's limitations in graphics and memory up front, and with this, plus the shared memory and quick memory access, they're able to blow anything from Qualcomm out of the water. The only way Qualcomm could compete with that chip is if they dictated to the customer "this is the build you're getting, you can't upgrade RAM, or use a different GPU, etc." That's just not how they operate.


It is not only Rosetta. The M1 has special hardware to speed up emulation, like support for Intel's stronger memory-ordering (TSO) model, allowing multithreaded code to be emulated with much less effort.


It is not just Rosetta but also the compiler toolchain, the long experience with iOS libraries running on ARM providing an almost identical API, and multiple rounds of binary architecture transitions via fat binaries, in combination with the App Store.

Having said this, Samsung just has to find a few buyers and kick-start the software development in this iteration. Whether they can build momentum depends on the next model, on their software/hardware partners, and on how committed they are.


> If the ARM ISA was inherently so much better it would be easy for Qualcomm and Samsung to release top performance ARM laptops, but they can't.

They simply don't really want to.

Take just about any top ARM SoC from either of them, and scale the caches and bus widths to the M1's size. The advantage will immediately stop looking so dramatic.


But that is an argument against ARM and an argument in favor of any processor that is being designed by a large company with lots of capital and good engineers. The ISA doesn't even enter the conversation.


I doubt it matters to Samsung.

They are not Google, or an American company for that matter, where the mentality is win at first try or die.

They will try to erode the market slowly but with constant growth, learning from other companies' mistakes and having better fabs and better logistics.


That is Samsung's MO, but it hasn't worked especially well with Exynos in the past. Part of it is that Samsung SLSI doesn't have the kind of support structure that Intel, AMD, Qualcomm, Mediatek, or even RockChip have to enable customer designs.

The bigger issue though is that just about every OEM has been burned at some point by relying too much on Samsung. You can work around that with memory and to some extent displays, but taking that risk on a CPU is rough.


It seems to me M1-style performance should be easy to achieve for other large companies with access to decent foundries... one would think Samsung could just use a similar ARM core, then study the design of the M1 to come up with a competitive cache design. (but I'm no expert, happy to have someone correct my view with better information)


If that were the case Apple couldn’t have maintained a huge performance lead in smartphone CPUs for half a decade.


Excellent point. Also Apple has a unique advantage of hardware + OS all built under their control and carefully optimized.

Regular chip companies can't influence their customers as well as Apple's hardware team influences the OS departments.


They had a lead because they were optimizing for a different target than Qualcomm. Apple was more concerned about single core performance while Qualcomm was focusing on lower cost and efficiency.


1. There was no similar ARM core until recently.

2. Samsung/Qualcomm/etc have lower prices and lower margins than Apple so they can't afford to make chips that are as big as Apple Silicon. For example, the M1 has four big cores but competitors use one big core and three medium.


2. To be fair, A14 is just 2 big cores compared to other smartphone SoCs' 4(3+1) big cores. (but Apple's core design is richer than others)


If it was that easy Exynos wouldn’t already lag the A14 so badly.

It should gain maybe 15% in a smaller process, but has a lot farther to go.


I was curious about the actual performance difference and I don't know what the big-name benchmarks are, but a quick Google search showed the Tech Centurion benchmark [1], which puts the A14 just 5% ahead of the newest Exynos and Snapdragons. That's not too big of a lead? Last year's benchmark against the A13 showed a much bigger gap tho.

[1] https://www.techcenturion.com/smartphone-processors-ranking


The industry-standard benchmark for comparing completely different CPUs has been SPEC, going back to the days when every UNIX workstation vendor also had its own custom CPU and instruction set.

Exynos and Snapdragons aren't even close.

https://www.anandtech.com/show/16252/mac-mini-apple-m1-teste...


The benchmark uses Snapdragon 865, not the newer generation. The Snapdragons also run at mobile phone power budgets.



That's single threaded performance though. Snapdragons have more cores at the expense of single core performance.


The Snapdragon 888 only has one X1 core.


1x Kryo 680 (ARM Cortex X1-based) Prime core @ 2.84GHz, 1x 1MB L2 cache

3x Kryo 680 (ARM Cortex A78-based) Performance cores @ 2.4GHz, 3x 512KB L2 cache

4x Kryo 680 (ARM Cortex A55-based) Efficiency cores @ 1.8GHz, 4x 128KB L2 cache

Vs 2 big 3.1 GHz and 4 small 1.8 GHz on the A14.

Comparing single core performance is kinda cherry-picking a benchmark where the A14 is supposed to be ahead.


The A55 is an in-order, 2-wide, low-performance core from 2017 that was an embarrassing choice to put in a modern SoC even a couple of years ago.

Compare it to Apple's "little" Icestorm cores, which are at about the performance level of everyone else's big cores a couple of years ago.

>The Icestorm design is actually a quite major leap for Apple as it sees the introduction of a third integer ALU pipeline, and a full second FP/SIMD pipeline, vastly increasing the execution capabilities of this core. At this point it would be wrong to call it a “small” core anymore as it now essentially matches the big core designs from Arm from a few years ago, being similar in complexity as an A75.

https://www.anandtech.com/show/16192/the-iphone-12-review/2


I think they'll catch up pretty quickly - there's a limit to architectural improvements Apple can make to stay ahead.


"It's hard to believe that you can use a mobile phone with a high resolution screen and get 8 hours SOT with a varied workload."

Dunno what SOT is, but I don't think my >2-year-old Pixel would last an hour of browsing.


TBF that mobile machine aggressively puts apps to sleep and has other techniques to avoid consuming power. A laptop doesn’t bother in the interest of a different experience.


That raises the question though: why don't they?

My phone is snappy and the user experience is amazing. It's rare that it's visible to the user that this is happening. If I minimize an application on my laptop then why shouldn't it just sleep?
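
For what it's worth, the freezing mechanism itself already exists on desktop OSes; the hard part is policy and app cooperation, not the freeze. A minimal sketch on Linux/macOS using POSIX job-control signals (the busy loop is just a hypothetical stand-in for a background app):

```python
import os
import signal
import subprocess
import time

# Stand-in for a "minimized app": a child process that would otherwise
# burn CPU in a loop forever.
app = subprocess.Popen(["python3", "-c", "while True:\n pass"])
time.sleep(0.2)  # let it start spinning

# "Minimize": freeze the process. It stops consuming CPU entirely but
# keeps all of its memory and state, much like apps on a phone.
os.kill(app.pid, signal.SIGSTOP)

# ... later, "restore": execution resumes exactly where it left off.
os.kill(app.pid, signal.SIGCONT)

app.terminate()
app.wait()
print("app was frozen and resumed without being killed")
```

What phones add on top of this primitive is the policy layer: deciding when it's safe to freeze, and a notification path so frozen apps can still be woken for events.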


First of all, because the UX is very single-activity oriented (the iPad has a limited amount of multitasking). "desktop/laptop" multi-window/multi-app WIMP interface requires lots of active processes.

Apple did implement the capability to freeze apps and kill apps with no windows but there was a huge outcry and few apps advertise this ability to the OS.


Smaller screen equals smaller power consumption though. Phones keep shutting down apps and keep their screens off the whole time. No wonder you get longer usage.


Laptops are overrated. I had MBPs for 15+ years. But I no longer go to conferences or take my laptop with me; my phone works for most things. Then I recognized that 99% of the time I use my laptop on a table with an external monitor. Switched to an iMac Pro, which already was a much better experience. Now I have a powerful desktop which is cheaper than an MBP but more powerful (the main reason was ML training though, otherwise I would have stayed with the iMac Pro).


They are not overrated. They allow you to move your office any time you want: you can work from office, work from home, work from a hotel, etc. Laptops are mobile workstations and with 2-in-1 they are also big tablets.


I no longer go to hotels. I no longer work on holidays. I no longer work at home if I have an office, and I no longer work in an office if I do remote work.


But I can take my laptop from my living room to another room if I need more peace, then bring it back to show something to my child, and I can bring it anywhere I want without much hassle... the laptop is my mobile computer...


I absolutely have to disagree with you on this.

Laptops are certainly not overrated, and from my experience laptops have been the standard form factor for office work for over a decade, and longer in some sectors.

People constantly on the go have been using them for even longer than that, and unless you have very specific workloads that have to run locally and cannot be pushed to a powerful server (CAD/CAM, 3D modeling, video editing), there are absolutely no drawbacks to using a laptop with a docking station, which lets you take your work with you to meetings, presentations, conferences, anywhere a white-collar worker needs to go.

(Blue collar workers and others with a need to use a computer out in the "real world" have obviously always used portable machines, since the very earliest ones. Toughbooks exist for a reason.)

It's just a lot more flexible and makes it easier to collaborate with others. Modern docks can run multiple 4K displays, so the need for dedicated desktop PCs has become smaller and smaller.


"Laptops are certainly not overrated, and from my experience laptops have been the standard form factor for office work for over a decade, and longer in some sectors."

Yes they have. a.) Because of status ("why does only the tech department get Apple laptops?") and b.) because people want to have more options, the more the better. So if you have the option to move the laptop, you get one, whether or not you ever actually move it.

Yes I understand this, I have had 10+ Apple Laptops.


Usually there hasn't even been a choice to make. Everyone just gets a laptop with a docking station, a standard-configuration Lenovo/Dell/HP. If you want a desktop PC, you need dispensation based on tasks or specific needs. It's been over a decade since a laptop was a "status symbol".

A laptop lets you take your work with you, instead of having to futz around with USB sticks and conference/meeting room PCs that never work quite right.

Laptops also simplify WFH enormously, something that has only become even more relevant over the last year. When I started at my current workplace ~14 years ago some of my colleagues had a desktop PC at work and one at home. At my workplace before that, desktop PCs were still the default, which was a whole mess of RDP-ing to your main office PC when you were at the remote server room office, and you'd better hope that nobody had accidentally switched it off while you were away. Luckily for them, they were finally switching everyone to laptops shortly before I left.

With processing power leveling out as it has over the last couple of years, there really isn't much reason to choose a desktop over a laptop, unless you have some specific very power-hungry needs. A decent laptop does 95% of what a desktop does, and the few people who do need that 5% aren't in any doubt of what they need.

(And even then, a lot of those needs could be served by an eGPU)


Performance isn't that important for most people. A 6-year-old laptop is plenty fast enough; just the weight and battery life suck.


> I doubt Samsung is gonna come even close to Apple in performance

Why not?


They’re not in control of the operating system. I have no doubt that Samsung can create a chip that’s every bit a good as the M1. I just think they’ll struggle to get the same level of performance from Windows as Apple can by controlling both CPU and OS.


OK, my take: it's not the processor, it's the software. iOS went the extra mile to make ARM work well, including weird tricks with Objective-C... Samsung doesn't have that! They have Android, which is bad for both performance and battery. So maybe build something for Fuchsia?


This is not true. Apple of course does optimize their software for ARM, and there are a handful of "magic tricks" in the processor for some workloads. But in general the processor is just really, really fast.


I am not seeing significantly better battery life from my friend's iPhone


Huawei as well, at least in China, and with a Linux OS:

> Huawei launches a Linux laptop with an ARM-based Kirin 990 processor

> Now Huawei has launched its first laptop that doesn’t feature an Intel or AMD chip. The Huawei Qingyun L410 is powered by Huawei’s own Kirin 990 processor, an ARM-based chip that was initially developed for smartphones and tablets.

https://liliputing.com/2021/05/huawei-launches-a-linux-lapto...


Truly the year of desktop Linux is soon here!


For that to happen, we'd need a company to invest into good UI/UX. As long as I have to DuckDuck to find the button for creating a new folder in Nautilus ... desktop Linux will remain a niche.


Seems like an odd example to choose, you right click and then "create a new folder" just like in Windows.


Ubuntu is pretty much there UI/UX wise for example. If Samsung provides top notch firmware then I don’t see any problems with a Linux laptop and would want one myself


I don't think so; compared to macOS or Windows, Ubuntu doesn't even come close in terms of UI/UX. Pop!_OS or elementary OS feel much more polished.


> feel much more polished

'Polish' is the last thing users care about. If that weren't true, Windows and Android wouldn't have been the most popular operating systems for decades and decades.


I find GNOME more consistent and polished than Windows 10. They have made some wise design choices and created a fairly minimal environment that looks decent, not the confusing chaos Windows can be.

The point is, better UX does not always win.


Again, that's your personal preference. If you give a Linux desktop and a Windows desktop to an average person for a week, I'm sure they would pick Windows, since that's what they can easily identify their workflows with. Linux distros have a long way to go before they're mass-adopted. I'm from India, and a while ago student laptops offered by a few universities had Ubuntu as the default OS. Installing Windows was the first thing the students did.


Back in the days when Windows and Android rose, they were the only mass-used OSes. Now Linux has to compete with the Mac UX, which normal users find very good, and the Windows UX is deeply imprinted in people. That's what I meant by polished.


Look at Chinese Android phones: they get away with slapping an unresponsive and confusing shell onto their phones and still sell millions.

I wouldn't be surprised if this happens for laptops soon as well, with custom in-house DEs. I imagine Chinese OEMs will be happy to get rid of Windows.


Not a fan of Chinese phones. But have you used a modern Oppo, Huawei or Xiaomi recently?


Yeah, modern MIUI seems surprisingly well thought-out and designed in some areas, with small details like rippling animations.


There's plenty of good UI/UX outside of GNOME. Personally I find the UI/UX in KDE to be better than Windows and macOS's desktop managers. Each to their own and all that though.


> There's plenty of good UI/UX outside of GNOME.

True, but keep in mind that GNOME sets the bar very very low.


Wouldn't this be the year of laptop Linux?


Always is.


To be fair to Linux, this time could actually be different, with a market shift as big as this. The biggest reasons Windows consistently wins are backwards compatibility and a large suite of software/apps. Windows for ARM kills these advantages. If ARM is the future, the playing field is being leveled significantly.


This isn't the first time the industry shifted to ARM and Linux. Remember the Netbook era pioneered by Asus EeePCs but later adopted by lots of others? Back then Microsoft extended the life of XP specifically for those machines and offered massive subsidies to OEMs to ship Windows -- even though that meant throwing in an Intel processor instead of ARM. And eventually Microsoft won because the Intel systems were sold for less to consumers despite costing more to build.

So the story isn't going to magically be different here. If Linux starts to get a foothold then Microsoft will just offer some subsidies to get their OS back on the market. The difference here is that desktop Windows does now run on ARM -- so if anything, Linux has less of a competitive advantage than it did in the Netbook era.


Those early EeePCs were amazing. This is before the 11" Macbooks and Chromebooks, etc. Back then laptops were full-size and expensive. An EeePC netbook allowed one to have something they could put in their bag easily for only a portion of the cost of a laptop. They were absolutely limited, but if all you needed to do was basic stuff such as write a paper in the library, they couldn't be beat.


They were fantastic, my 901 lasted years - with Debian and a couple of upgrades (storage/RAM) it was a great little machine. Lightweight, compact and rugged.

I even ran OS X snow leopard on it for a while.


Windows for ARM has mostly-working compatibility with x86 and x86-64 Windows apps. It's not as good as Rosetta 2 on M1, but it's present and functional - so the app suite is still an advantage.


Gaming is the thing that locks a lot of us into Windows. Direct3D really has allowed MS to keep the Desktop Windows as a monopoly IMHO (in addition to what you have said).


It helps that OpenGL is an API designed by committee, where the amount of extensions means having to code multiple paths anyway, depending on the card, driver and OS.

Vulkan isn't any better, just the bare basic and then extension loading fun.

Game developers tend to prefer proprietary APIs because vendors actually care about the overall development experience end to end, instead of expecting the community to come up with something.


Wait until you find out who’s on these committees…


I know who's on them, and the fact that they'd rather contribute to the private APIs than to community tools kind of says it all.

As an example, the mesh shader and ray tracing APIs weren't created in those committees; they're yet another set of extensions taken from proprietary APIs.


It's not 2017 anymore. The two big things that lock gaming into Windows are anti cheat and video game launchers.

I remember messing with the Warframe launcher and renaming some exes to bypass another launcher. The games run fine. The hyper complicated Chrome Embedded Framework stuff breaks all the time because it uses weird sandbox crap that is intimately tied to the windows kernel.


...and practical Nuclear Fusion is only 10 years away! /s


Don't most of consumer Linux laptops just get pirated windows installed on them?


No. Often (not always, but surprisingly often) Linux laptops work out more expensive than Windows ones due to Windows subsidies and greater economies of scale.

So you'll see people buying Windows laptops to run Linux far more often than you'll see Linux laptops bought to run Windows (pirated or otherwise).


I thought a Linux laptop/computer was the same thing as a Windows version, just without the Windows tax. I have not looked at these offerings in years, so has the Linux offerings from the big vendors matured enough that they are different than the Windows versions?


I believe the idea is that Windows laptop makers make more money from preinstalled bloatware (McAfee, Candy Crush, etc.) than the cost of the OEM Windows license. So if they ship the laptop with Linux which doesn't support the bloatware then they don't make as much money as compared to the Windows version at the same price.


Not just this, though this is a big factor. Often there will be hardware differences too. For example a Linux laptop might run more expensive hardware simply because it has better support with open source drivers (thus satisfies Debian et al users). Whereas a fair number of Linux users are more than happy to run closed proprietary binaries instead. So often Linux certified laptops will be different hardware to their Windows counterparts rather than just Windows machines shipped without an OS.

Then you also have the economies of scale, where building a laptop for 1k customers works out more expensive (price per laptop) than building the same laptop for 100k customers.


There's effectively a free (unactivated) version of Windows 10. The only real difference from the paid version is that you can't change personalization settings like the wallpaper.


The OS that the article says will be installed, Unity OS, looks like it's just Deepin but forked by another company, backed by the government, and not open source, right?

I've recently been looking into checking out Deepin because even with all the distrust in China, open source is still open source and I think it's amazing that an open source scene seems to be flourishing over there as well, at least in some form.


The real meat of the story:

> The new Exynos chip for laptops will use the graphics processing unit (GPU) jointly developed with US semiconductor company Advanced Micro Devices Inc. (AMD) to offer improved graphical technology, according to the sources.


That was my thought when I read it as well, and I had to scroll all the way down to find this comment.

AMD said part of the reason they partnered with Samsung was that they don't compete in mobile phones. Which is a valid point. But now Samsung is competing directly in laptops and PCs using this tech. As a tech enthusiast, of course I want to see this ARM / AMD graphics combination. But as an investor I wonder if we were misled in a way.

And if this is true, does it mean AMD is open to licensing AMD graphics IP?


> And if this true does it mean AMD is open to license AMD graphics IP?

This is not the first time this has happened. There was an Intel CPU with AMD's GPU.

https://ark.intel.com/content/www/us/en/ark/products/130411/...


AMD didn't license their IP to Intel. It was a collaboration with AMD GPU fabbed on TSMC and linked together via Intel EMIB.


It could be a good market for AMD. There are tons of companies designing their own ARM processors who don't have the resources to also create a GPU. Nvidia is completely absent from that market, at least currently, so it might be an easy way for AMD to push their GPU design.


From a business POV, yes. I just don't like companies that lie. Of course, at the moment this is still a rumour.


Hopefully this means ARM chips will finally have a good GPU option that has high quality open drivers


It's kind of amazing that Nvidia is so widely reviled for frustrating, agonizing drivers filled with blobs, while ARM has been around providing no help, no docs, no assistance, no drivers whatsoever for over a decade. No one's been thrilled, but basically doing nothing has been more effective PR than doing a so-so job.

The situation has changed somewhat recently, but basically only after the reverse engineering efforts succeeded & were fully operational.


ARM has plenty of docs, sources, etc. for its Mali GPU: https://developer.arm.com/ip-products/graphics-and-multimedi... Their GPU drivers are even distributed by Debian: https://wiki.debian.org/MaliGraphics

You are probably conflating vendors like Broadcom who license the ARM ISA and processor IP but add their own (closed) GPU to SoCs they produce, like those used in the Raspberry Pi.

Remember ARM is just a company that licenses and sells an ISA and chip designs. It's up to system integrators to turn those designs into physical chips, and sadly we have seen that many of them don't value making their designs available to open source, etc. The solution is to not support those vendors, not to complain at ARM.


I've spent about 10 minutes clicking through looking for anything even mildly technical. The first link you sent me, "plenty of docs", is a 2-page PDF product brief & 5 web pages describing some features at a high level. There's a ton of docs... for game developers, application developers, &c., talking up features & capabilities. But for making drivers? For describing how to use the hardware? There are some links to embedded development / DDK development sites for ARM, but none of them are open access. ARM won't help.

The Debian drivers you cite are the Panfrost and Lima drivers, which are the reverse-engineering work everyone else did in the dark with no help from ARM, until very, very recently.

I am not conflating anything. ARM has provided no documentation and no support to help folks make drivers. I stand by my original post entirely, and disagree deeply. ARM's GPU design is stamped out into countless chips, verbatim or close to verbatim. Even if individual chip makers did want to support their chips, I suspect it is in fact ARM who would block them from releasing open source drivers and from documenting the chips they make: ARM's desire has been to protect its IP, and to help the various embedded support partners that do know how to work with their chips and their IP in a secure fashion. ARM has been a bad player in the Linux world and has worked against allowing Linux support. The problem is absolutely ARM, not those vendors.


Can someone explain to me how ARM expect to protect its IP if it doesn't publish anything about it?

I was under the impression that in order to protect IP, one has to either patent it or copyright it, thus some form of publication must be made. If ARM does publish technical documentation (1), ARM has a way to prove infringement or prior art whenever someone else tries to use their IP without their approval. If ARM doesn't, how can they prove anything?

(1) Obviously, if they publish the tech. docs, they might as well help driver implementors, might they not?


Mali is every bit as proprietary as Nvidia.


> Mali is every bit as proprietary as Nvidia.

I don't think anyone was claiming it isn't. I think GP's (and their GP's) point is that ARM /usually/ makes the information needed to develop high-quality open source drivers available, whereas Nvidia tends to do the opposite.

Whether these technologies are proprietary or not is orthogonal to this issue.


But it absolutely doesn't, that is the point. There is no register level documentation or documentation for the instruction set the chips use. There is no pertinent architecture documentation at all. Having a list of performance counters does not equal documentation.


Android is a hellscape of proprietary drivers but desktop Linux is basically down to one: Nvidia. It stands out more because it's the last holdout.


Desktop Linux on ARM is in a similar situation to Nvidia, with the proprietary Mali drivers from ARM vs the incomplete reverse-engineered Panfrost / Lima open source drivers.


Qualcomm is pretty widely disliked for doing the same. Google even called them out for creating security problems by making updates hard (although they weren't quite as harsh as Linus was.)


Amazing what they were able to do in such a short period of time after completely stopping R&D on both their custom CPU and GPU in 2019, opting to instead form a strategic partnership with AMD for the new exynos chips.


There's a bunch of comments here about performance, but IMHO even 10 years ago hardware was already fast enough (and any gains in speed are just being taken by increasingly bloated software); what I'm more worried about is openness --- will Samsung release detailed documentation for its SoCs, on par with what TI and other ARM SoC companies have done, or even better?

I really hope Apple's M1 does not start a trend towards even more computers with basically zero public documentation.


Personally, I searched the web for how to run a normal Linux build on the Samsung Exynos 2100 SoC and could not find much information at all. A very sad state of affairs compared to an RK3399 or i.MX8, especially considering how much more performance the Exynos SoC appears to deliver. I am looking forward to the new Rockchip offerings (RK35xx) and the continued progress on mainline support for the VPUs. Not holding my breath for the Samsung or Qualcomm SoCs.


> even 10 years ago hardware was already fast enough (and any gains in speed are just being taken by increasingly bloated software)

This point doesn't stand the test of time, much like how Gates supposedly said "640K ought to be enough for anybody" (except he didn't[0]). Computers will keep getting faster so that developers can keep abstracting away 'hard' parts of development, leading to what most would today consider bloat: Electron and React/Vue/Polymer. It is only considered 'bloat' until the hardware is fast enough, and has enough memory, that the slowdown it causes becomes negligible and it simply becomes a tool that increases productivity and the number of products/features consumers see each quarter.

0: https://quoteinvestigator.com/2011/09/08/640k-enough/


Isn't there a saying along the lines of "infrastructure is cheaper than developers?"

I've mostly seen it in a cloud computing context, meaning it's cheaper to pay for more/larger servers than it is to pay developers to make their code more efficient. But it also makes sense on a more basic hardware level.


I don't know the saying, but it makes sense - it might take 10 years of Azure runway over a few years to pay developers to remake something made in ASP.NET into a static binary that can handle 10x the number of requests on the same processors. But at that point, businesses aren't going to spend time making it faster if they can pay more to have a faster development cycle and get cheaper talent in the long run.
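
The break-even arithmetic behind that tradeoff is easy to sketch; every figure below is a made-up assumption, purely to show the shape of the decision:

```python
# Hypothetical figures, for illustration only.
monthly_cloud_bill = 20_000      # current hosting spend, USD/month
speedup = 10                     # rewrite serves 10x requests per core
rewrite_cost = 3 * 2 * 150_000   # 3 developers x 2 years x $150k/yr

# A 10x speedup lets you run roughly 1/10 the servers for the same load.
monthly_savings = monthly_cloud_bill * (1 - 1 / speedup)
breakeven_months = rewrite_cost / monthly_savings

print(f"monthly savings: ${monthly_savings:,.0f}")       # $18,000
print(f"break-even after {breakeven_months:.0f} months")  # 50 months
```

With these made-up numbers the rewrite only pays for itself after roughly four years, which is exactly why businesses keep paying the cloud bill instead.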


I am still using my 12 year old Intel® Core™ i7-870 for gaming, working, and doing other stuff.

I just use SublimeText (not bloated) instead of Visual Studio Code (bloated) and HeidiSQL (not bloated) instead of MySQL Workbench (bloated).

So, for my personal experience, yes this point has totally stood the test of time.

Will I get a new computer? Yes, eventually, when supply gets back to normal prices or when my current computer can no longer be of use to me.

Also: fuck bloated software. It is a much bigger waste of electricity than Bitcoin.


True, but you’re not getting things like language servers or the wide range of extensions available with VS Code. It’s like how GRRM still uses a DOS machine running Wordstar - it works with old software but you give up all new features and conveniences afforded by faster CPUs.


I have used language servers with Sublime Text.

It also has a rich ecosystem of extensions, it is not like VSCode invented extensions or something like that.


Nope, it'll be Android, ChromeOS, and Windows only with no docs and three different bootloaders.


Yeah, if ARM takes over the laptop market it will be the beginning of the end for open software. Laptops will ship locked down like phones do today. There's no realistic alternative to the OEM's stock OS unless you trust shady and often botched ports. It will also be the end of pirated software.


Of course this depends on what you're trying to do. If you're doing 3d rendering or video compositing or heavy audio DSP or gaming you need all the power you can get.

I share your concerns about openness though.


What, their chips aren't good enough for their own phones, but they think they're good enough for dedicated laptops?


Yeah, this phone: https://browser.geekbench.com/v5/cpu/7821516

Probably has plenty more power than this laptop: https://browser.geekbench.com/v5/cpu/7185745


That's an interesting point, actually. Samsung doesn't need to compete with the M1, it only needs to compete with Celerons and Athlons to make a profit.

As far as most consumers are concerned, even Exynos chips may be worth it for desktop use because they can use the extensive video acceleration support that mobile GPUs have which many desktop chips still lack.

Sadly, the Windows world is far from ARM-compatible. Few people install applications through the Windows Store, and bringing Linux to the masses with an alternative, low-power laptop processor will probably end up failing as well. I fear the lack of software support will prevent Samsung from replicating Apple's success.

Maybe Samsung can stuff these chips into Chromebooks, those provide a limited yet cross-platform operating system that some markets are getting used to already. There are plenty of ARM Chromebooks out there, so I'm not sure how well Samsung could compete in that space.


They ditched their own inferior custom core design and now use ARM Cortex X1.


They already run a Samsung desktop called “Dex” on their phones.


They also had Linux on Dex which was a full Linux desktop environment. Unfortunately it never left beta and was killed off.


I think ultimately they couldn't make it compatible with Android 10 and that led to its demise.


Dex isn't bad. I use it from time to time. It's really not much different from a chromebook. I'd use a dex laptop if it was available.


They do use their own chips in their own phones, just not in every market.


The latest exynos chip replaced Samsung's custom CPU designs with stock ARM ones, presumably because exynos variants of the S20 lagged far behind the snapdragon variants in performance and battery life.


This could be a temporary thing; even Qualcomm has done this in the past and gone back to its own designs after 1-2 generations.

My guess is that they needed the engineers for AMD GPU integration and ARM happened to have a good design at that time so the choice was simple.


I think the original comment was referring to the initial backlash Samsung received after putting inferior Exynos chips into non-US devices, which were significantly less performant while drawing more power than their Qualcomm counterparts.


They don't sell Exynos phones in the US to avoid patent wars with Qualcomm.


The previous N generations of Exynos weren't good but this time will be different. Because AMD is magic sauce or something.


It wasn't too long ago that Samsung Chromebooks were powered by an Exynos. Even a lightweight system like ChromeOS was quite slow...


https://browser.geekbench.com/mobile-benchmarks

The 2.5 year old XR is faster than Android flagships.

I wonder what tricks Samsung has to close the gap.


I bet the trick is that the new exynos laptops are just like the old ones: low end.


I have a Samsung Chromebook with a Rockchip ARM processor. It's 3 years old and still blazing fast, with multiple tabs open, video conferencing etc, with amazing battery life. Would be nice to see what they can do with their own chips.


I wouldn't call RK3399 blazing fast, but it's quite usable.


The GPU in that chip is so slow it can't fill the screen with a solid color at 60 Hz.


Is that so? How can ChromeOS display anything then? Displaying a desktop environment with multiple apps and/or moving webpages must be impossible.


None of it happens at 60 FPS. Dragging windows, animations, etc are all janky on that GPU.


True. I should have put in bold that for a 3-year-old device it's fast. Based on what I do with it as described above, it does the job without making me want to throw it out a window because it's slow.


Says more about ChromeOS than the processor. Try installing Windows or Ubuntu GNOME on it.


Ubuntu GNOME would probably work fine, except when it comes to video. People run similar stacks on the Pinebook Pro, which is the same SoC design. Unfortunately using a mainline stack with the rk3399 will still get you CPU en/decoded video, rather than using the dedicated VPU. That seems to be changing soon.


I would have thought AMD's licensing agreement would prohibit Samsung products including Navi 2 from competing with AMD's own laptop-focused products (Cezanne & rumored Rembrandt).

>AMD will license custom graphics IP based on the recently announced, highly-scalable RDNA graphics architecture to Samsung for use in mobile devices, including smartphones, and other products that complement AMD product offerings.

So I would only expect this in Samsung laptops, where so far they have been hesitant to use AMD SoCs.


I think it's a customers-first approach, much like Rimac selling vehicle technologies directly to its competitors.


Technically, Samsung jumped into this market already in 2012 (Samsung Series 3 chromebook had an exynos CPU).

I think the real news here is the AMD collaboration.


There was an Exynos chromebook nearly 10 years ago! It was pretty great for what it was. Slim, light and cheap!


This one (https://en.wikipedia.org/wiki/Chromebook#Samsung_Series_3) with the Exynos 5? In terms of value for money it was, in my opinion, the best computer ever sold. One of those cost £229 and was in constant use for about 8 years, carried around in a rucksack, used for e-mail, web browsing, video calls, iPlayer, editing files on Google Docs, ...

If they could make one of those that runs Debian with the security features of a Chromebook but without the Google dependency ... it would be too good to be true.


Yes, that's the one.

I did have mine set up to dual boot Ubuntu back then, it was a reasonably capable little machine. Wouldn't like to comment on the security.

As you say, it was rugged and cheap enough to throw around wherever, and you can do enough in ChromeOS to get by. I only eventually ditched it when updates stopped being available.


I'm more interested in their new 360 laptops; those screens look amazing, 11th-gen Intel CPUs are tolerable from what I've seen, and the iGPUs are decent enough to drive a 5K monitor.

I would like an ARM option, but I'm afraid nobody is getting close to the M1 soon; going by the rest of the ARM ecosystem's track record, these will be considerably worse than x86. Unfortunately Apple isn't big on providing choices or cannibalizing the iOS market, where it gets a 30% cut of everything going on the device, so I have to hope companies like Samsung get it right.


> nobody is getting close to M1 soon

Except the former M1 team that founded Nuvia, soon to ship Qualcomm's laptop SoC.


>The first Qualcomm Snapdragon platforms to feature Qualcomm Technologies’ new internally designed CPUs are expected to sample in the second half of 2022 and will be designed for high-performance ultraportable laptops

So they will have engineering samples by the end of next year? When I said soon I sort of meant within a two-year period. In the long run Intel is likely to catch up on process as well, but devices shipping in the next 2 years will be underperforming compared to Apple's M processors.


Didn't Nuvia have internal samples in 2020? Good for Qualcomm if their competitors don't expect M1-class product competition until 2023.


Interesting, since this didn't happen after the Surface debut but did happen after the Macbook debut. Of course the processors are better now.


Hopefully this is competitive with the M1. I would really like to use something non-x86 as my daily driver, just for the feeling of moving forward.


You have been able to buy ARM windows/ Linux/chromeos devices for years now. What makes this one more important?

Neither this nor the M1 is the market leader in raw performance or power efficiency either.


I meant the M1, which is a market leader in terms of performance per watt in the same performance class.


Not necessarily; I need a _proper_ benchmark against AMD 5xxx to believe that.

Edit: proper means not using one M1 big core for a single-threaded performance benchmark and only the little cores in a battery test, then combining those two numbers into a performance/watt comparison.

You can't compare big-little with all-big in this fashion.


Anandtech has you covered on that. Hell, that's not even how the M1 works; at full use the M1 pools all 8 cores.

The M1 is a quad-big-core, quad-little-core chip with a roughly 30W TDP at max use (that's the performance/watt number to work off; macOS reports it at 5W per big core plus 3W for the little cores combined), and that includes its GTX 1050-class GPU. We have the numbers; we know where things stand.

Fact of the matter is that Ryzen 5000 chips are just 7nm and the M1 is 5nm.

For a ULV chip, it is shockingly competitive with and superior to 5000-series chips in single-threaded (most general-purpose computing is still thread-limited) and lightly-threaded (up to four cores) tasks. A critical thing to realize about the M1 is that it lacks turbo modes, so single-thread performance scales with cores, unlike most x86 chips that slow down when more than one core is at full use.

5000-series chips with more than four cores have advantages in massively threaded workloads, but that is absolutely not a reasonable comparison with the M1 in the first place: they are much higher-wattage chips in a very different market, and comparing a 45W chip with the M1 is a joke for the same reason you just explained above.

The M1 is a sweet-spot chip. It is not better than everything at all wattages, but in a lot of cases it is much better at most things. I think a lot of the praise has driven some untrue claims of how 'amazing' the M1 is and made some people think it compares with a Threadripper or such in multicore, which isn't true, and that has driven some unfair skepticism against the M1 as well. Watt for watt, core for core, the M1 is faster and cooler than anything AMD has; once you let core counts and wattage rise, you will find faster multicore performance.


We are not comparing M1 to 64-core threadrippers. We are comparing 8-core M1 to 8-core AMD and intel.

Which are faster? AMD by a huge margin, then Intel and Apple.

Which are more power efficient? Probably Apple, by a wide margin. But we need better benchmarks to confirm that. Many I have seen so far just do automated web browsing or YouTube videos until the battery dies. The latter isn't even CPU-bound, and the former is mostly a browser and display efficiency benchmark. Most likely neither uses the Firestorm cores much.

Does power efficiency even matter? My answer would be yes for laptops/tablets and no for desktops which are already running comfortably below 20W (see for example Asus PN51 series).


That's not needed. The performance is there when needed, as plenty of real-world benchmarks show. That isn't true of other laptop ARM CPUs.


Will they be able to support TSO (total store ordering) for faster x86 emulation, or did Apple manage to patent that?


“Real men have fabs.” - AMD founder, Jerry Sanders.

This is the new age: everyone can design a custom chip for their own product without having fabs. This is exciting. A custom chip designed for your own product can make it stand out, like Apple's M1.


Uh, won't this chip be built using Samsung's fab?


Samsung phones, Samsung Exynos, and Samsung fabs are essentially separate companies.


Sure, but the last flagship Exynos (2100) used the Samsung 5nm process.

https://www.anandtech.com/show/16316/samsung-announces-exyno...


Can Samsung do 5nm in their own foundries?


Yes, but the transistor density is lower than TSMC's 5nm.

https://en.wikipedia.org/wiki/5_nm_process


Will it run Windows, or what is their aim?


FTA: "If the product hits the market, Samsung would become the first modern Windows laptop manufacturer to launch its own processor that works as the brain in its machines."

That doesn't actually tell you what OS they're using, and it could just be the journalist trying to indicate how Samsung is the first at what Apple already did.

But Windows 10 does run on ARM as long as you're willing to use "modern" apps only. I suppose this won't be a machine developers will be excited by, as much as some of us want to be.


>But Windows 10 does run on ARM as long as you're willing to use "modern" apps only. I suppose this won't be a machine developers will be excited by, as excited as some of us want to be.

Windows 10 for ARM has always had x86 emulation, and they've recently extended it with x64 support: https://blogs.windows.com/windows-insider/2020/12/10/introdu...


Likely Windows ARM. It should prove to be a stout upgrade to the units MS currently uses in their ARM laptops.


MS failed in the mobile market and didn't bother to prepare an ARM version of Windows, and now Apple will be enjoying the desktop/notebook market with Wintel having at least 5 years of catching up to do.


Probably ChromeOS.


It took a while to find that this is an ARM Cortex CPU.


There's rumors of Google ARM silicon coming this year based on Exynos. We might see a similar chip in future Pixelbooks/Chromebooks.


There is no way I would buy a machine with FANG in the silicon.


I thought Google was doing its own design.

Do you have any pointers to these rumors?


A good summary of the rumors… GS101 “Whitechapel” chip: https://9to5google.com/2021/04/09/everything-we-know-google-...


Hope these ARM laptops still treat NVMe storage as a first-class citizen instead of soldering it down.


> In November 2020, Apple Inc. launched the M1 chip, a desktop- and laptop-grade systems-on-chip (SoC) based on a British processor architecture known as ARM.

Seems strange to call this out and not mention that the Exynos SoC is also based on ARM.


What else could it be? It's used in some of the Samsung's Android devices.


Intel is so fucked..


In 10 years, will we see ARM everywhere?


this comment should show up on my program


Please no. ARM chips are a nightmare enough for compatibility on phones and tablets, even devices several years old. I don't want to need custom patched Linux kernels just to boot my fucking laptop! Oh but I already know. That's deliberate. They're trying to prevent you from escaping their litterbox with a free OS. Because if you're trapped in the litterbox, all you have to eat are their tootsie rolls.


Contemporary ARM Windows laptops adhere to well documented standards. Booting Linux isn't a major challenge.


"Full" Windows on ARM devices provide a well-architected platform, unlike Linux-oriented throwaway designs. And they aren't locked down (unlike Windows Phone and Windows RT devices), because that would prevent Microsoft from offering them in its highest-margin market (enterprise), where one of their offerings requires customer-managed signing keys for Secure Boot.


As an early adopter, I pounced on the mobile-class-CPU-powered 12" MacBook. It was horrible. Still an early adopter, and I love my M1 MacBook Air. I hope Samsung's chip is closer to the latter experience and not the former!


It is telling that there are many obvious ARM fans in this thread, but not a lot of proud M1 owners. It seems Apple really screwed the pooch by failing to ship some sort of bootcamp for M1 to make Linux possible.


Apple obviously wants you to use macOS. With that said, the M1 fully supports booting unsigned/custom kernels, so running Linux doesn't require any security exploits. Asahi Linux aims to bring you a polished Linux experience on Apple Silicon Macs; it sounds like some of their code might make it into the 5.13 version of the Linux kernel. Oh, and the M1 doesn't have Boot Camp, but you can run Windows ARM with Parallels.


That code can only boot to a serial console. It is a start, but it will not produce a usable linux system in any way. It doesn't even touch drivers for the GPU/NPU/Bluetooth/WiFi/etc. By the time they can produce a minimum usable system which boots with a display, keyboard, headphones, usb, ssd, and trackpad, the M1 will likely be end of life. M2 went into production two weeks ago.

https://asia.nikkei.com/Business/Tech/Semiconductors/Apple-s...

You'd need to buy one now, speculating that someday the Asahi work will produce a barely usable system on a laptop you can't even buy anymore. Considering the 2016 macbooks are basically unusable with Linux now 5 years after release, I don't have a lot of hope the M1 ever will be.

https://github.com/Dunedan/mbp-2016-linux


Well duh. Apple released a laptop processor last year, so of course Samsung are doing one.

Samsung tags soon no doubt.


Rarely is Apple the first to do something. Usually they're the first to do it well, with high quality, and when they think the market is there for it to become more ubiquitous. Sometimes they'll pull a few things together that weren't previously, but for the most part they excel at providing a polished, high quality experience for something that already existed.


Perhaps they're referring to things Apple does that don't involve "doing it well", such as removing headphone jacks or USB power adapters, or adding the iPhone notch.

Samsung marketing team had field days on each of these moves, then the following release cycle their products did the same.


This doesn’t change the fact that Samsung always immediately clones whatever Apple does


How is that possible when Apple's clone is based on whatever Samsung already announced/released?


They copy each other but make minor errors in each copy. The fitter copies survive. That's called "evolution by natural selection."


You know Samsung lost a billion dollar lawsuit because they stole the entire iPhone UI right?


No, it was $500M and centered on this design patent for rounded corners: https://patentimages.storage.googleapis.com/fa/21/81/670fbfd...

That's it. These clipart pictures really are the complete patent.


you mean the lawsuits Apple couldn't win outside Koh's court in Apple's backyard?


Both examples (ARM laptop and tags) were literally already done by Samsung.

Samsung was also earlier with the smartwatch for example. What else in the past 10 years? Earbuds!

Note: the iPad and iPhone were earlier, but that's already more than 10 years ago. And the first tablet was way ahead of its time: 1989, with the GRiDPad 1900.


Yes, but Samsung copied the iPhone so closely that they lost a 7-year, $1 billion lawsuit and ended up settling out of court after appealing to the Supreme Court.

They almost exactly copied the major UI features in violation of the patents so it goes well beyond whether someone had made any kind of tablet before

The Galaxy Book Pro (well, at least they changed one word from MacBook Pro) has always been a close emulation of the MacBook line as well.


Half a billion, about rounded edges, imo.

And an Ultrabook is just a thin laptop; Apple doesn't have any patents there, because they weren't first with that. Notice "book" being in the actual generic name?

Not a strong argument for over 20 years...

Who even said: good artists copy, great artists steal? :)


Samsung Galaxy SmartTags have existed since January 2020.



> The device was announced at Samsung's Galaxy Unpacked event on January 14, 2020.

- https://en.wikipedia.org/wiki/Samsung_Galaxy_SmartTag


AirTags were leaked via trademark in 2019.[1] Tile has been around since 2012.[2] Someone on HN said something about whistling in the 80s.[3] Who did it first?

[1] https://www.macrumors.com/2019/10/28/airtag-trademark/ [2] https://en.wikipedia.org/wiki/Tile_(company) [3] https://news.ycombinator.com/item?id=27106048


Yeah I didn't say they were first I just corrected the person that I responded to about the date that the Samsung product was announced.


And the Galaxy Book S was already announced with ARM in August 2019.

A year before Apple's ARM announcement.

Edit: And a Samsung Chromebook with a Rockchip ARM has been on the market since January 2017...



