APL Tutorial (zerobugsandprogramfaster.net)
123 points by ktRolster on Aug 9, 2016 | hide | past | favorite | 55 comments


It is a pleasure to see APL crop up on HN from time to time. APL was the first language I learned, back in the 1970s. "First love," and all that. I remain a devoted user of the language, though only as a thought-amplifier and prototyping tool.

The big key to the language IMHO is the idea of computation liberated from time. This is what I think Dijkstra was getting at when he railed against "operational thinking". Instead of thinking of programming in terms of mentally replaying the execution sequence of the code you are writing, you can step outside of time, as it were. Computation becomes a way of thinking about mathematical transformations at an abstract, extra-temporal level. Dijkstra described this as programming via "predicate transformers". But he was still constrained in a scalar world. Iverson, on the other hand, encouraged people to dream in abstract aggregates of data. His mathematics of arrays was at once simple and rich, ripe with expressive possibility.

In my view, the functional programming people are doing beautiful work creating new realities and ways of thinking within this cognitive universe.

So, (pun alert) APL remains unsurpassed in its beauty and timelessness.


The earliest programming languages were the most creative. APL and Lisp are inspirational.


I got started with Machine language, APL, LISP, C, FORTH, roughly in that order. Used them all professionally for at least ten years (still using C, of course).

Truly an excellent way to venture into computing.


Makes me wonder if Knuth agrees with Dijkstra about predicate transformers?


Your comment is poetic.


For anyone interested, I noticed that Dyalog made their proprietary implementation of APL free for non-commercial use just recently [0]. I suspect this change might have been in reaction to an uptick in traffic to their site from this other APL HN post [1] a bit more than a month ago (there's some good discussion there as well).

[0] http://www.dyalog.com/ [1] https://news.ycombinator.com/item?id=11963548


They also have Mastering Dyalog APL by Bernard Legrand as a free PDF download. I've just started reading it; it seems to be a very gentle introduction to APL: http://www.dyalog.com/mastering-dyalog-apl.htm


I used it to learn APL, and it really is a great resource.


I learned C and IBM APL2 around the same time for financial apps. For APL, I read the Polivka and Pakin book about 30 times, not this edition, but the red paperback I used is apparently impossible to buy. Excellent book, it got me to where I could be competitive in obfuscated APL contests: https://www.amazon.com/P-L-Prentice-Hall-automatic-computati...


I have pictures of my younger self at an APL conference in 1983 with Polivka, Pakin, Ken Iverson and other APL luminaries of the day. Small community.


APL was hot in Poughkeepsie, where I grew up. My first APL program was a Mastermind simulator (on our high school's Telex connected to a timeshare system). My dad worked for IBM, and Ray and Ken were household names.


Did you go to PDS (Poughkeepsie Day School)?


"Obfuscated APL"? Isn't that a pleonasm?


I agree, it is one of the best tutorial/technical books I have ever read. Very clean writing style, and the authors clearly enjoy their topic.


> i read the Polivka and Pakin book about 30 times

I hope that's because the book was more accessible than the computer platforms at the time, otherwise that is a steep learning curve.


Nah, I just think it means he was able to go from relative beginner to complete expert with a single book. It's rare to find such a multi-layered book that has in-depth knowledge but is still useful to beginners.


Check this out: you miss out on the ambience until you actually see it in action: https://www.youtube.com/watch?v=_DTpQ4Kk2wA


Great video, thanks for sharing! A bit embarrassing that Professor Bob Spence teaches at my/our alma mater and I didn't realize it was him until checking out the description at the end.

Wish I had used my time at University more productively, so many interesting people and subjects concentrated in one place. It went by very quickly, and most of the time was spent worrying about passing exams. Seems I'm much more interested in some of the subjects nowadays; I certainly would have appreciated the experience more now than at that age.


Vintage livecoding sessions, I love old programming videos!


Anyone know of a good place to buy teletypes like this?


You may want to learn J instead. It uses ASCII, and it's from Iverson, so it's the true continuation of APL.


Or K (a JS interpreter called oK, with an interactive coding environment called iKe, can be found in John Earnest's GitHub account).

It is a distillation of APL/J concepts. Much smaller core, clicks better than J for some people.


Or Nial, which is APL in plain English.

http://www.nial.com/OpenSource.html

They're working on distributed and GPU versions.


I didn't know about any new effort with Nial. I'll have to check it out. The distributed and GPU parts are especially enticing to me. CUDA is wearing me out!

I have really become comfortable with J now, and I don't know if it would help me to step into a text-based APLish language. More typing for me to do the same as I do in J, and I don't code for or with others, so no collaboration arguments here. And I can read my old J programs just fine, even years later. Most code fits in a glance, and even if you are not very practiced, it takes very little time to refresh, as opposed to looking through pages of 'easy to read' Python, for example.

How is Kerf [1] coming along? How similar to J?

[1] https://github.com/kevinlawler/kerf


Kerf's doing well: we're adding a sort of NoSQL functionality which is pretty unique (unstructured data you can index and join on). Closer to K than J, but really pretty different from both, beyond the Array language heritage.

I had considered building a distributed processing system using J and zeroMQ, but now considering doing something like this with Kerf. Main thing is ... do the customers want it. I can't pay rent with technological innovation.

FWIW, Pascal has something you might be interested in for GPUing in J, if you are not aware of it: jarrayfire. https://github.com/Pascal-J/Jfire I haven't screwed around with it, but I keep meaning to get around to it. Looks solid. I made an abortive attempt to do something cheaper (CUDA hooks, basically), but got distracted.


Code is read much more often than it is written. That's why I prefer APL's symbols over J's ASCII clutter.


APL isn't that much more readable. They're both fairly obtuse, so it's really just whichever you learned first.


That really isn't true at all.

For someone who did not study mathematics, symbols like the integral, differential, or summation signs are meaningless and obscure. A Greek letter "delta" with a dot on top only has meaning once studied, learned, and used.

For someone who did not study music and musical notation, symbols like a rest, an eighth note, or a sixteenth note are utterly meaningless.

However, for people who studied these subjects and devoted enough time to get good at them these symbols have huge meaning and, yes, they become tools of thought.

Spend enough time with APL --just a couple of months real effort at most-- and the symbols are not obtuse at all. They develop meanings just like mathematical or musical notation does. Combinations of symbols become words and phrases. The programmer eventually starts to recognize them instantly and think in these terms, much like someone well versed in math might recognize a polynomial and instantly visualize and understand its shape and position in space.

If I may continue the analogy, programming symbolically with APL quickly becomes like writing and playing music. It's an amazing experience that cannot be described, it has to be experienced.

If you have not devoted a good amount of time to working with APL the above might not make any sense and might even sound crazy. And that's OK. Just understand there's a huge difference between forming an opinion while watching gymnastics vs. actually being a gymnast.

J is a huge step backwards. Notation is incredibly powerful. The decision to discard it simply because the language was ahead of the hardware of the day was, at best, flawed.


No, I understand, but J is as much a notation as APL is. It may be a different notation to what you, as an APL programmer, are used to, but it is as much a notation as APL.

While you say that my judgement is clouded by not using APL, I say the opposite is true: from here, they're equally symbolic. The difference is, one of them uses symbols I can type.


Well, we'll have to disagree then. I've had this conversation many, many times. Using a pile of ASCII characters is, objectively, a very bad idea. And, despite what you have said, it is not a notation.

Hardware engineers have actually had contact with APL without knowing it. Programmers' manuals for many CPUs use notation derived from APL to describe how the CPU executes code. It's intuitive and easy to read.

Iverson himself used it to describe the inner workings of the IBM 360 even before APL existed. If I presented anyone with a databook or IBM 360 technical manual that used J's transliteration, the very first statement out of their mouths would probably be "the printer fucked up, look at this gibberish".

Again --and I don't mean this as a personal attack at all-- with zero experience in APL it will be hard for you to understand the power and value of a real notation rather than using a mish-mash of ASCII characters and attempting to ascribe meanings to them. There is a huge difference between the two.

Music notation is a perfect parallel for the value and power of APL notation. I only wish it had been taken farther. Maybe someone will pick up where Iverson left off and develop the next generation of symbolic programming notation. In many ways we need to evolve past where we are today in order to gain the kind of power and expressiveness that, again, one can experience with music notation when composing and playing.

Have a great day.


A lot of people have cited the APL Life example, but I found this J example of somebody exploring Ulam's spiral to be a great illustration of how J can be used to approach a mathematical concept [1]. It's very easy to follow along with J open next to YouTube, and it still amazes me how quickly you can visualize from the REPL with 'viewmat'.

Jupyter notebooks are the modern kin to this, and are still trying to catch up to Mathematica, which, for me, really shines for toying around with curated data and one-liners.

[1] VIDEO - https://www.youtube.com/watch?v=dBC5vnwf6Zw
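For anyone without J handy, the same kind of exploration can be sketched in Python/NumPy: build the spiral of 1..n² and a boolean prime mask you could hand to any image viewer (J's viewmat, or matplotlib's imshow). This construction is my own quick sketch, not the one from the video:

```python
import numpy as np

def ulam_spiral(n):
    """Lay out 1..n*n in a counterclockwise spiral from the center (n odd)."""
    grid = np.zeros((n, n), dtype=int)
    x = y = n // 2
    k = 1
    grid[y, x] = k
    # Direction cycle right, up, left, down; leg lengths 1,1,2,2,3,3,...
    dirs = [(1, 0), (0, -1), (-1, 0), (0, 1)]
    leg = 0
    while k < n * n:
        dx, dy = dirs[leg % 4]
        for _ in range(leg // 2 + 1):
            if k == n * n:
                break
            x, y = x + dx, y + dy
            k += 1
            grid[y, x] = k
        leg += 1
    return grid

def is_prime(k):
    return k >= 2 and all(k % d for d in range(2, int(k ** 0.5) + 1))

spiral = ulam_spiral(31)
mask = np.vectorize(is_prime)(spiral)  # True where the spiral holds a prime
# Feed `mask` to viewmat/imshow to see the diagonal streaks of primes.
```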


APL\B5500: The Language and its Implementation

Gary Kildall

https://news.ycombinator.com/item?id=12223304


How does something like "R" compare to APL?

If you replaced the Greek letters and other symbols in APL with function names, keeping the right-to-left order of operations, is there something missing? (I only did a tiny bit of APL for a class 30 years ago.)
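Semantically, mostly nothing is missing; what you give up is density. A rough sketch of the trade-off, using NumPy as the named-function stand-in (the APL fragment is a standard idiom; the variable names here are mine):

```python
import numpy as np

# APL computes the average of a vector as (+/x) ÷ ⍴x,
# reading right to left with no precedence rules.
x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])

# The same computation spelled out with named functions:
avg = np.sum(x) / x.size

# Chaining further (deviations from the mean) stays a single short
# phrase in APL; with names it needs nesting or a temporary:
deviations = x - avg
```

Same semantics either way; the symbolic form just packs a whole phrase into a glance, which is the property the rest of the thread is arguing over.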


I've played with J a bit. I have yet to see any J code that can't be translated in a pretty straightforward way to idiomatic Python + Numpy.

Of course, it's pretty important to note that idiomatic Numpy is very different from idiomatic Python. For example, here's some idiomatic numpy:

    gamma = np.arange(0, 1, 1/1024.0)
    x = np.outer(r(T - event_times[M+1:N]), gamma)
    log_likelihood += M*np.log(gamma) + np.sum(np.log(1 - x), axis=0)
    return np.exp(log_likelihood)
Idiomatic python might be:

    for i in range(0,1024):
        log_likelihood[i] = M*log(gamma[i])
        for j in range(M+1,N):
            tmp = r(T-event_times[j])*gamma[i]
            log_likelihood[i] += log(1-tmp)
        likelihood[i] = exp(log_likelihood[i])
    return likelihood
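Both snippets above lean on names defined elsewhere (r, T, event_times, M, N), so here is a self-contained toy version of the same vectorized-vs-looped contrast. The function f and the data are made up purely for illustration; the point is only that the two styles agree:

```python
import numpy as np

def f(t):
    # Stand-in for the r() function in the snippets above
    return np.exp(-t)

T = 10.0
event_times = np.array([1.0, 3.0, 7.0])
gamma = np.arange(0, 1, 1 / 1024.0)

# Vectorized: one outer product, one reduction over the event axis
x = np.outer(f(T - event_times), gamma)   # shape (3, 1024)
vec = np.sum(np.log(1 - x), axis=0)       # shape (1024,)

# Looped: the same result, element by element
loop = np.zeros_like(gamma)
for i in range(gamma.size):
    for j in range(event_times.size):
        loop[i] += np.log(1 - f(T - event_times[j]) * gamma[i])

assert np.allclose(vec, loop)
```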



Python has the idiom he describes as a builtin.

    with open(filename) as f:
        ...your func goes here
If you want to create your own custom one:

    @contextlib.contextmanager
    def with_a_foo(...):
        foo = ...
        yield foo
        ...cleanup the foo...
It's less terse (and in my opinion more readable [1]) than J, but same idea.

[1] I very rarely find point-free programming to be comprehensible. This may be my mental limitation.
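For completeness, a runnable version of that sketch. The resource here is a hypothetical stand-in (a plain list), and the cleanup is wrapped in try/finally, which is the idiomatic way to guarantee it runs even if the body raises:

```python
import contextlib

@contextlib.contextmanager
def with_a_foo():
    foo = []          # hypothetical resource standing in for "a foo"
    try:
        yield foo
    finally:
        foo.clear()   # cleanup runs even if the with-body raises

with with_a_foo() as foo:
    foo.append(1)
    snapshot = list(foo)
```

After the block, `snapshot` is `[1]` and `foo` has been cleaned back to `[]`.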


I think you mean J.

Easy: J is an abomination. Notation is a very powerful thing and J destroyed it. It was a misguided attempt to deal with hardware limitations of the time.

I used APL professionally for about ten years.

Don't take it from me; ironically, Iverson himself made the case many years earlier in this paper:

http://www.jsoftware.com/papers/tot.htm


Made the case for what? In another paper, he wrote:

> Since J is now free, and uses only ASCII characters, the requirements set forth in my A Personal View of APL have been fully met.

http://www.jsoftware.com/papers/autobio.htm


Notation as a tool for thought, of course!

In quoting the article you conveniently leave out that A Personal View of APL dates back to 1991 and, technically, had its foundation in the years prior, because of just how frustrating it was to use APL during that awkward time in computing.

I am not sure if you know APL, whether you've ever used it, or for how long. I go way back. I was in the thick of it as the IBM PC came out and various APL packages tried to make the transition. I started using APL on time-shared mainframes over Honeywell and DEC teletypes and Tektronix storage CRT terminals; that's how far I go.

It was downright frustrating. On the PC, APL interpreters had to ship with a character ROM for the monitor card just so you could see the characters on the screen. In the process you'd lose about half of the ASCII table to APL. If you wanted to print APL characters you had to buy specific printers, like the IBM Selectric, and install a special APL typeball. A few got clever and installed a physical toggle switch on the computer to switch between the APL and ASCII character sets.

Yeah, you had to crack open your computer and change a chip just to be able to see APL characters!

I wrote and presented a paper at an APL conference in the early '80s. Back then you had to submit the paper in print for reproduction. I had to write my own APL paper-printing program to drive an IBM Selectric such that it would pause and allow me to change the printer's character ball to print the code sections in the paper.

You really truly had to want to do APL in order to be willing to endure this kind of pain. The lucky ones had Tektronix storage CRT terminals or IBM terminals with built-in APL characters.

Eventually dot matrix technology made things easier. Yet, we were still crippled by the limitations of ASCII and having to live this dual life of needing APL characters to program and regular characters for the "user space" application.

That being the context, Ken Iverson buckled to the pressure and transliterated APL to ASCII characters, something that would result in J. This, of course, was extremely short-sighted and went exactly counter to the power he found in notation as a tool for thought.

A man who devoted no less than twenty years to developing the beautiful and powerful concept of a notation for describing digital systems and programming, simply made the wrong call just as technology was about to make the use of such notation simple and universally accessible to all.

J is not APL. Far from it. The power is in the notation. J throws that away. It's a shame.


> For most of us, getting access to a computer and being able to use it as long as we wish is merely a dream.

Ah, so he has kids, too.


The real question is where do I get a magic typewriter that works like that???


You can also use the Emacs mode (disclaimer: I'm the author), which not only gives you reasonable input methods for APL characters, but also a lot of other features, such as source code navigation and the ability to edit matrices in SES (the Emacs spreadsheet).

https://github.com/lokedhs/gnu-apl-mode

And a video I made to demonstrate some of its features:

https://www.youtube.com/watch?v=yP4A5CKITnM


You can try J instead, http://www.jsoftware.com/

As Wikipedia puts it [0]: it is a synthesis of APL (also by Iverson) and the FP and FL function-level languages created by John Backus.

To avoid repeating the APL special-character problem, J uses only the basic ASCII character set, resorting to the use of the dot and colon as inflections to form short words similar to digraphs.

[0] https://en.wikipedia.org/wiki/J_(programming_language)


Sorry, I meant the typewriter from here: https://www.youtube.com/watch?v=_DTpQ4Kk2wA


Like all the non-English speakers do.


As a young IBM engineer, seeing APL keyboards put me off programming for a decade. The incomprehensible hieroglyphics were daunting.


No, it's not. It is a horrible abomination. See my other post.

It's like changing all math symbols to words because the computers of the time can't display the integral sign or greek letters. A total destruction and negation of the power of notation. Iverson made a very bad decision with J. It's crap.


We detached this subthread from https://news.ycombinator.com/item?id=12255109 and marked it off-topic. Inclusions like “It is a horrible abomination.” and “It's crap.” are not necessary or acceptable on Hacker News.


Just saw this.

Hey, you know, your forum, your rules. So, fine with me. No issues.

However, you are getting tripped up by language and, effectively, censoring it, which is exactly opposite to our culture. And passing judgement on the necessity of language is a very subjective thing. Profanity or colorful language can communicate at a level that is almost impossible to match with hundreds of words.

In this case the use of the word "abomination" is objectively correct and true. This is coming from someone who started using APL over 30 years ago, was very much a part of the community, published APL papers, met every luminary in the language, and fully understands how, when and why J came about. Few people, very few people on HN, can match this degree of depth in APL, and most voice opinions about J out of ignorance of the history and genesis of this history-changing mistake.

The only modern parallel I can find is what happened with Python 2.7 and 3.0. Not a perfect analogy because the incompatibilities did not kill Python. J all but assured that APL was dead.

Instead of sticking with and evolving the great concept of notation as a tool for thought, Iverson managed both to give the world one of the most amazing new concepts in computational problem solving and, years later, to pretty much kill it. Yes, "abomination" fits that history.

Still, your forum, your rules. Live long and prosper.


APL symbols are slightly more suggestive than J symbols; sometimes they vaguely resemble mathematical counterparts with similar meanings, but they remain entirely separate programming symbols with their own meanings. The idea of using other, more common characters instead remains an excellent one. Afflicting J with Lovecraftian adjectives and scat doesn't help.


> "It's like changing all math symbols to words because the computers of the time can't display the integral sign or greek letters."

You mean like TeX? Seems pretty fucking sensible to me.


The worst part of APL (I learned it in college) was dictating programs to colleagues.


imo using the symbols is part of the fun of APL! Apart from the utility.


Sigh, I guess you're right. It's not like there aren't massive disadvantages to the APL way of doing things, like not being able to edit your code in standard editors, having to learn confusing new keybindings, or even getting a new keyboard.

OH WAIT.


Not anymore... now that UTF-8 is spreading everywhere, APL is ready for a comeback! Most editors (Eclipse, Emacs, vi, etc.) can handle the characters, and most OSes can handle different keymappings, so there's no need to buy a new keyboard.

Of course, you'll still have to learn confusing new characters.



