You may find this article I wrote useful background for understanding the capabilities of numerical computing in Swift, including details about a library which provides features overlapping with the new Swift package: https://www.fast.ai/2019/01/10/swift-numerics/
Thanks, it is a well-written article. The only part that tarnishes an otherwise excellent piece is the language-comparison section. Such comparisons are best avoided, as they almost always reflect the author's biases or the areas where they have the most experience.
As you acknowledge, you were unfair to Julia. Julia has excellent support for general-purpose programming. For example, it supports multiple dispatch, a quite powerful organizational construct rarely found in other languages. Where it might fall short is in the number of general-purpose libraries, but that shouldn't matter as much for ML. It certainly isn't lacking for numeric libraries.
For Java, it's unclear what you mean by JVM issues, but with GraalVM and Scala or Kotlin, I fail to identify any major shortcomings.
F# on .NET Core can be distributed without installing a runtime, nor does garbage collection limit its expressiveness. It is among the fastest languages, in the Go/Java/OCaml/Swift tier, which is the next fastest after C/C++/Rust/Fortran. Compare https://benchmarksgame-team.pages.debian.net/benchmarksgame/... to https://benchmarksgame-team.pages.debian.net/benchmarksgame/.... Setting aside the problems of microbenchmarks and unidiomatic code, the two are within spitting distance of each other, each winning and losing by no significant margin. The .NET runtime also recently introduced SIMD and fine-grained memory-management support.
I'd say that, practically speaking, Swift's great current weakness is its cross-platform support, when compared to Julia, Kotlin/Scala, or C#/F#.
Swift is an excellent language, and it's great that a numeric programming ecosystem is being built for it. There's no need to justify its existence by misrepresenting or downplaying the capabilities of competing ecosystems. I also saw this happen in the discussion of why differentiable programming went into Swift and not Julia. The more the merrier, I say. Let the various communities collaborate and share ideas in order to better explore the space of possibilities.
On paper F# seems pretty great. Unfortunately, in practice it was the first and so far the only language I've ever used that I never found myself enjoying.
I wrote a simple but relatively robust toy-language interpreter in it, so the time I put in wasn't trivial, and I kept thinking it would get better, but no such luck. This is just my anecdotal experience though; I know some companies have been wildly successful with F# as the technical foundation for their entire platform. Jet is a great example of one such company.
Certainly, not every language will fit everyone's preferred style of thinking and approach to breaking down problems. This is why I think a diversity in available options is so useful.
stop the world GC (my car ran you over because of a random unpredictable pause!)
its bytecode is object-oriented
impossible at the moment to do CUDA style GPU programming without horrible JNI calls
> F# on .net core can be distributed without installing a runtime nor does garbage collection limit its expressiveness
F# has two big crippling factors: (1) the .NET at the end of its name and (2) functional programming, which is a huge barrier for 99% of developers (and it's not functional enough for the other 1%). F# is dead in the water--I mean that respectfully. To replace Python for deep learning, you need a language that other people will actually use, not should.
> stop the world GC (my car ran you over because of a random unpredictable pause!)
Given a certain baseline of available resources, and in the places where GC issues can be problematic (regimes I do not believe are common), unpredictability of the GC is the main issue, and it can be readily addressed. You can, for example, preallocate and then manage memory manually, or opt for a specialized JVM. If startup time is an issue, there are native compilation options. Or just don't use the JVM; that's an option too.
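To make the preallocate-and-manage-manually point concrete, here's a minimal sketch in Java. The class names (`Frame`, `FramePool`) are illustrative, not from any real library; the idea is simply that all allocation happens at startup, so the steady-state loop never hands the GC new garbage:

```java
// Sketch: preallocate a pool of reusable buffers up front so the
// steady-state loop performs no heap allocation, sidestepping GC pauses.
import java.util.ArrayDeque;

final class Frame {
    final float[] pixels;
    Frame(int size) { pixels = new float[size]; }
}

final class FramePool {
    private final ArrayDeque<Frame> free = new ArrayDeque<>();

    FramePool(int count, int frameSize) {
        // Allocate everything once, at init time.
        for (int i = 0; i < count; i++) free.push(new Frame(frameSize));
    }

    Frame acquire() { return free.pop(); }   // no allocation in steady state
    void release(Frame f) { free.push(f); }  // reuse instead of letting GC reclaim
}

public class PoolDemo {
    public static void main(String[] args) {
        FramePool pool = new FramePool(4, 640 * 480);
        for (int i = 0; i < 1000; i++) {
            Frame f = pool.acquire();
            f.pixels[0] = i;      // do work against the reused buffer
            pool.release(f);
        }
        System.out.println("done");
    }
}
```

This is the same pattern low-latency JVM shops (trading systems, games) have used for years; combined with a modern low-pause collector it covers most of the "unpredictable pause" objection.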
I'll also note that automatic reference counting is no panacea either, and Python is no better suited to the scenario you've given. Personally, I haven't found this to be an issue given modern low-pause concurrent GCs, but your mileage may vary.
> its bytecode is object-oriented
> impossible at the moment to do CUDA style GPU programming without horrible JNI calls
I don't think these are deal breakers. Making it easy to write kernels in a high-level language is very much an open problem. Even TensorFlow faces this issue, with most workflows optimized around a handful of prewritten kernels. On the JVM there are options such as https://index.scala-lang.org/thoughtworksinc/compute.scala/b... or http://aparapi.com/ for GPU-backed ND-arrays or bytecode-to-GPU translation.
> F# has two big crippling factors: (1) the .NET at the end of its name and (2) functional programming, which is a huge barrier for 99% of developers (and it's not functional enough for the other 1%). F# is dead in the water--I mean that respectfully. To replace Python for deep learning, you need a language that other people will actually use, not should.
I don't think (1) is true, and even if it were, I don't see its technical relevance. If you're doing numeric/array-based differentiable programming, then you shouldn't have any problem with functional programming. I'd even argue functional programming makes things easier, as you get many things for free from existing combinators. Many concepts are naturally expressed with the features such languages tend to have.
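To illustrate the combinator point with a sketch (in Java's standard Stream API rather than F#, since Java is the common denominator in this thread; the helper names are mine, not from any library), here is a small normalization pipeline expressed as map/reduce rather than hand-written loops:

```java
// Illustrative sketch: array math composed from existing combinators.
import java.util.Arrays;

public class Combinators {
    // Mean via a built-in reduction instead of an explicit loop.
    static double mean(double[] xs) {
        return Arrays.stream(xs).average().orElse(0.0);
    }

    // Center the data by composing map with the reduction above.
    static double[] center(double[] xs) {
        double m = mean(xs);
        return Arrays.stream(xs).map(x -> x - m).toArray();
    }

    public static void main(String[] args) {
        double[] centered = center(new double[] {1.0, 2.0, 3.0});
        System.out.println(Arrays.toString(centered)); // [-1.0, 0.0, 1.0]
    }
}
```

The same shape (`map`, `fold`, `average`) is what you'd write in F#, and it's exactly the vocabulary numeric/differentiable code already speaks, which is why the "functional programming is a barrier" objection seems weakest here of all places.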
I don't think Python needs replacing; I'd much rather see interoperability and language agnosticism. I can tell you that for each of Julia, Haskell, Scala, Kotlin, OCaml, Nim, F#, Rust and of course Swift, at least one fascinating machine learning library is being built. I think that's a great thing.
I do not believe this generalization is accurate. It wasn't me who said that but the person I quoted. F# is plenty functional, but what it does lack are organizational constructs that reduce code duplication. In practice, these are not deal breakers for what you get in return (especially since you can work around them by leveraging more OOP features, which isn't necessarily a bad thing), but that, as always, depends on preferences and priorities. Among functional languages, F# is pretty streamlined, believe it or not. F#'s pragmatism means it overlaps substantially with the subset of Haskell that gets used in practice to keep things simple. Even more so for OCaml.
But the person might have meant something like: if you already know OCaml, you might complain about the lack of functors or polymorphic variants. Similarly, if you already know Haskell, you might complain about the lack of higher-kinded types or GADTs. On the other hand, F# offers its own features (clean implementations of active patterns, computation expressions, type providers, async, multicore) and advantages, mostly from being able to leverage the .NET ecosystem (which also explains why those specific features are missing).
So your scenario there is a separate thread that doesn't generate or free garbage? Because if it does, there will be synchronization inside malloc/free and your thread will pause. Also, the Swift compiler is unpredictable about when it will use the heap instead of the stack; allocations are not explicit. So it might be hard to accomplish in practice.
I agree that this special case could be more realtime with ARC, but on other GCs such an isolated worker can be implemented as a separate program instance (a separate process, or a separate VM instance in the same process) that has memory allocation and GC disabled after init.
In the other cases, where you keep the GC on, I think the low-pause GCs of today should be suitable for tasks like computer vision in cars, because the pause times are so short (sub-millisecond) compared to the processing time for a frame. And the generally higher throughput of non-RC GCs should more than make up for it.
Technically, it is a form of garbage collection. However, garbage collection is a very broad term that arguably encompasses even C++ smart pointers. Typically, though, "garbage collector" refers to a separate process that runs independently of the application, either in the background or by "stopping the world" to collect.
ARC is part of the application code. It is also visible to the compiler's optimizer, which can optimize away redundant reference-count operations.
Typically one refers to the CS definition of automatic memory management, per standard references like "The Garbage Collection Handbook: The Art of Automatic Memory Management".
By bundling. I believe there are a couple of options to somewhat reduce the resulting package size. As with all things, it's a matter of priorities when weighing language and deployment options.
It does not. Since both systems under comparison are subject to the same caveats, the comparison is meaningful when normalized to this setting. That is, while microbenchmarks should not replace task-relevant testing, they do have utility as a coarse indicator. The information should be taken with a large helping of uncertainty, but it points in the generally correct direction in terms of the relative ordering of the systems compared.
The reality is that other things will often dominate: computational complexity, the appropriateness and optimization of the data structures in use, I/O bounds, cache locality, and problem-specific details. When things are done properly, these tend to reduce, not magnify, the differences between languages that sit near each other in a relative ordering; when things are not done properly, they slow everything down majorly. This holds especially when idiomatic code is no more expensive to write in any of the compared languages, as is the case here.
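A tiny sketch (in Java; the numbers and names are illustrative, not benchmark results) of how data-structure choice swamps constant-factor language differences: a membership test against a list is a full O(n) scan, while the same test against a hash set is an expected O(1) lookup, a gap no language tier can paper over.

```java
// Illustrative: the asymptotic cost of the chosen structure dominates.
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class StructureDominates {
    public static void main(String[] args) {
        int n = 100_000;
        List<Integer> list = new ArrayList<>();
        Set<Integer> set = new HashSet<>();
        for (int i = 0; i < n; i++) { list.add(i); set.add(i); }

        // O(n) per query: scans every element in the worst case.
        boolean inList = list.contains(n - 1);

        // O(1) expected per query: a single hash lookup.
        boolean inSet = set.contains(n - 1);

        System.out.println(inList + " " + inSet); // true true
    }
}
```

Swap the `List` for the `Set` in a hot loop and you gain orders of magnitude, which is far more than any gap between, say, Go and F# on a microbenchmark.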
Does anyone use Swift (in production) for backend services not related at all to the Apple ecosystem? It seems like a better Golang to me, but the tooling space only targets Apple stuff.
The issue with Swift on the server currently is that it's an entirely different compilation toolchain from the one used on macOS and iOS, so the language has different bugs than those that exist for app development, and it gets less attention from Swift developers.
If you want to take a stab at server-side Swift, I'd recommend looking at the Swift port of Netty that Apple released, called swift-nio (after the name "swetty" was nixed by Apple marketing and communications) - https://github.com/apple/swift-nio
Honestly that's no different than realizing that AVFoundation or CoreGraphics is Apple-platform-only...
Network.framework is an Objective-C framework (I'm honestly surprised I don't see any C++ symbols in the binary) – it was never supposed to be a cross-platform framework.
SwiftNIO is the official base of the OSS Swift cross-platform networking toolchain. It's the equivalent of Java's Netty.
I've pushed a few admittedly minor backend services in Swift to production. It is ok.
You basically have one viable non-Apple OS platform (Ubuntu) to deploy on. This means that a basic Golang service is a 10MB Docker image, while a basic Swift service can be over 100MB. There are frameworks like Swift NIO, which is based on Java's Netty (and some Apple developers who work on Netty also work on Swift NIO). It works well enough, but in most benchmarks Swift is not even close to Netty ( https://www.techempower.com/benchmarks/#section=data-r18&hw=... ), so if you're pushing code expecting high performance, I would think twice. Swift gRPC (the latest version is based on Swift NIO) is also available, and while I think it works very well, it is still relatively new.
As far as tooling goes, Swift is very immature IMO if you step outside Xcode. There are efforts to get an LSP service fully working (SourceKit-LSP), but I find it to be ok at best (performance and code-completion suggestions are often very hit or miss). From benchmarking to diagnostics/backtraces to logging and metrics frameworks to shared knowledge and answers on Stack Overflow, it is still very early days for Swift. Golang is so far ahead here that I personally think (at least today) the only reason to launch a Swift service into production is that you want to reuse code you already have in your app.
If you like Swift's type system and want a backend-service equivalent, I would strongly recommend looking into Rust. IMO, Rust is a version of Swift where the programmer is given more control over what the code is actually doing (along with the associated responsibility). It sounds like a lot of trouble, but I find most Swift code translates naturally to Rust (especially if you follow the "value vs. reference" semantics ideology that the Swift compiler team advocates). At worst, if you learn Rust, you will understand Swift a lot better (like what an escaping vs. non-escaping closure really is, or what difference it makes to use a generic versus a dynamic protocol type (dyn Trait in Rust) in a function definition).