These four languages form the basis of modern programming. You should know at least two of them by now.
The only thing lisp formed the basis of was other lisps. The other three (and algol) got a lot more integrated into a common theory. That's why there are no common lisp jobs.
http://www.paulgraham.com/diff.html
every latte-sipping, dynamically typed, trendy scripting language that is created brings us closer to LISP. people just prefer syntax over macros.
I had to learn C because it's what college teaches us here.
You know that JS's core features are based on Lisp, right?
C-chads won
had to learn some C and Lisp for my college class 🙂
I had to learn C in college, and I'm an emacs user. Am I based, anons?
I "learned" Objective-C, Common Lisp, Smalltalk, Clojure, and some other languages that pop up thanks to retards who think "scripting languages" should be a thing. I only use Clojure and Common Lisp nowadays, but of course I occasionally deal with retarded shit like python because Ansible and related tools exist, sadly.
I don't really use C or Objective-C nowadays. I can read C code but I see no point when most of my projects are heavy bit twiddling and math, which can be done with much more convenience in Common Lisp for me. Picking up Fortran might be fun, but I also have zero motivation to do so.
>I learned objective C
Can you make an iPhone app without using Xcode?
There's literally nothing to learn or know in C. It's a very small programming language. Dynamic memory allocation, pointers, learning how many bytes is what (char is 8). And even then that shit is different on every system. You could learn C in a day.
>learning how many bytes is what (char is 8).
Clearly you should have spent more than a day learning C
Yeah, it's 8 bits, one byte. I'm a retard.
>char has 8 bits
Never assume anything regarding C
https://begriffs.com/posts/2018-11-15-c-portability.html
He did say
>And even then that shit is different on every system.
See also https://faultlore.com/blah/c-isnt-a-language
The way I like to say it is "if your language is portable, your code isn't." And I mean academically portable, so python, java et al don't count. Rust has the advantage of compiling to machine code but also having a defined, specced ABI. Of course it has a dump truck of other problems, but whatever, everything is worse than just not using a computer.
>C is really easy
>a char is eight bytes
This kind of thing is what keeps me coming back here.
>(char is 8). And even then that shit is different on every system.
char is always 1 byte on every system
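For anyone who wants to check this on their own machine, a minimal C sketch of the point: sizeof(char) is 1 by definition, but the number of bits in that byte is CHAR_BIT from <limits.h>, which must be at least 8 and is not guaranteed to be exactly 8 on every target (some DSPs use 16 or 32).

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* sizeof(char) == 1 by definition in the C standard. */
    printf("sizeof(char) = %zu\n", sizeof(char));

    /* A "byte" in C is CHAR_BIT bits wide: at least 8, but not
       necessarily exactly 8. */
    printf("CHAR_BIT     = %d\n", CHAR_BIT);

    /* The widths of the other integer types are only bounded,
       not fixed, which is the "different on every system" part. */
    printf("sizeof(int)  = %zu\n", sizeof(int));
    printf("sizeof(long) = %zu\n", sizeof(long));
    return 0;
}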
Scheme
C
JavaScript
Chapter 4 query language of Abelson and Sussman
forth and smalltalk are pretty much irrelevant. forth more so, but even smalltalk wasn't too influential, as almost all of its arcane features haven't been implemented in any popular languages, whereas with lisp the only things that haven't been are macros, homoiconicity, and s-expressions, but really that's pretty much just one thing.
wasn't smalltalk the OOP lang?
OOP existed before, but smalltalk is pretty much the purest form of OOP. But smalltalk does some very radical stuff that no popular OOP language today like javascript, java, or c++ does, and those languages would probably have developed similarly without smalltalk
C (for its syntax and general "structured" model) and lisp have completely changed the computing ecosystem, and pretty much every modern language today would be different without them.
forth is completely irrelevant
> I don't know what I am talking about: the post
>trusting wikipedia over common sense
if you'd ever used smalltalk you'd know it's completely different from java, objective-c, python, etc. java is much closer to C than smalltalk, and hell, even objective-c is much closer to C than smalltalk
Smalltalk literally invented the term OOP, retard.
Simula had record classes before, but they didn't call it "OOP".
C didn't invent nor popularize its structure. In fact almost all of C's structure was already present in BCPL.
The syntax comes from algol; the structure is even far older than that. Modern programming would only be different without C because C was pushed by corporate interests and caused a dark age of computing from which we haven't yet recovered. Aside from that, plenty of other languages, many older than C, had similar syntax and infinitely better features (including pascal, simula, algol and its descendants such as B and BCPL, and even things like BLISS).
Forth is still used in embedded to this day (double-digit %) and was the most popular embedded language up to the mid-2000s. It's also the basis of PostScript and PDF. Every performant language VM (e.g. Java's) uses the stack machine model, inspired by Forth (even though the concept was older).
Algol got it right the first time; C updated it.
Algol got it wrong, but far less wrong than C did. C gutted it.
>Forth is still used in embedded to this day (double-digit %) and was the most popular embedded language up to the mid-2000s.
[citation needed]
See ESC 1991, Vol. 2, p. 790: "Forth and Embedded Systems" by Woehr.
By 2002, it was reported that only about 2% of embedded devs used forth regularly, according to Embedded.com. I recalled this point being reached closer to 2007, but oh well.
Other interesting anecdotes:
Forth was the first resident software on the new Intel 8086 chip in 1978, and MacFORTH was the first resident development system for the Macintosh 128K in 1984.
Atari, Inc. used an elaborate animated demo written in Forth to showcase capabilities of the Atari 400 and 800 computers in department stores. Three home computer games from Electronic Arts, published in the 1980s, were written in Forth: Worms? (1983), Starflight (1986), and Lords of Conquest (1986).
The following spacecraft used Forth running natively on an RTX2010:
Advanced Composition Explorer (ACE)
NEAR/Shoemaker
TIMED
Rosetta's lander - Philae
>double-digit %
If you're going to include C in that image, you should replace Smalltalk with C++ and Lisp with JavaScript.
>C (for its syntax and general "structured" model)
Algol, Fortran, and PL/I are a lot more important than C. The only thing C did was popularize null-terminated strings.
>The only thing C did was popularize null-terminated strings.
Which was a horrible mistake, to the point even modern C programming guidelines require the use of string libraries that use double-pointers or length+pointer instead.
>modern C programming guidelines require the use of string libraries that use double-pointers or length+pointer instead
Such as? Which guidelines and which libraries?
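For illustration only (the guidelines and libraries above are the other anon's claim to back up; nothing here comes from a real library, the names are made up), a minimal sketch of the length+pointer style in plain C:

#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical length+pointer string view: carries its length
   instead of relying on a terminating '\0'. */
struct str_view {
    const char *data;
    size_t      len;
};

/* Wrap an ordinary C string; strlen() is paid once, up front. */
static struct str_view sv_from_cstr(const char *s)
{
    struct str_view v = { s, strlen(s) };
    return v;
}

/* Length-bounded comparison: no scanning for '\0' past the end. */
static int sv_equal(struct str_view a, struct str_view b)
{
    return a.len == b.len && memcmp(a.data, b.data, a.len) == 0;
}

int main(void)
{
    struct str_view a = sv_from_cstr("forth");
    struct str_view b = sv_from_cstr("forth");
    printf("%d\n", sv_equal(a, b)); /* prints 1 */
    return 0;
}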
Sorry, I only know FORTRAN and it's enough for me.
Replace l*sp with LC
Forth is utterly irrelevant and influenced nothing important. Replace it with Prolog.
Replace Smalltalk with PLITS.
>seething anti-lisp fag refuses to acknowledge where literally everything good in programming came from
>common sense
unfortunately, history doesn't care about your personal opinion
>everything good in programming came from
I just said, lambda calculus.
Lambda Calculus is a model of computation, not a programming language, poser.
By this logic, we should also include Turing machines.
>Lambda Calculus is a model of computation, not a programming language, poser.
Lisp is an implementation of it, not sure what you're on about.
There is no Lisp without LC
Anon...
Is calculus also one of the most important programming languages because it influences all of them?
Don't resort to a bad strawman just because you made a fool of yourself.
Either directly refute my statements or stop responding.
Lambda calculus just isn't a programming language.
It's a theoretical model of computation.
Yes, everything is built on something previous. Programming languages are built on theoretical models. Houses are built on foundations. Foundations aren't houses, and theoretical models aren't programming languages. You're trying to reverse inheritance.
>Forth is utterly irrelevant and influenced nothing important.
PostScript and PDF, as well as the Open Firmware CLI.
>Forth is utterly irrelevant and influenced nothing important.
Bitcoin signing scripts literally use a restricted version of Forth.
However, Prolog is indeed more deserving than Smalltalk, which was really just a rebranding of Lisp LOOPS.
Prolog is more deserving than C, but nothing else. Prolog is used nowhere, its paradigm was not new at its inception and already partly existed in (or was easily added to) other languages, etc. It's more irrelevant than your mum's husband.
i am a retard. can someone tell me why elixir reminds me of smalltalk?
Is Forth worth learning? why?
stack languages can be interesting.
Especially if you like extremely minimal environments, which is what Forth was popular for in the 80s.
http://collapseos.org/
Alternatively check out Factor, which is a more "batteries included" Forth/stack language.
https://factorcode.org/
It's worth learning to understand a completely different, stack-oriented model of computation, which has many very interesting characteristics: Forth parsing is downright trivial and requires no analysis at all; Forth features mixed compile-interpret operations which enable a much stronger form of macros, WITHOUT any special syntax, that is more akin to compile-time evaluation; and stack operations are a pain but have several implications on syntax (non-)requirements which make implementing a Forth on a new platform trivial. It also has interesting aspects like being typeless (not dynamically typed -- all data is just "bytes"), which changes how you reason about writing your program.
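To make the stack-oriented part of that concrete, here's a rough sketch in C (not real Forth; the word set is invented for the example). It only shows the data stack and the trivial whitespace-splitting "parser", and deliberately skips the compile-time/word-definition side described above:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Tiny Forth-flavoured evaluator: one data stack, words separated
   by whitespace, no grammar to analyse at all. */
static long stack[64];
static int  sp;

static void push(long v) { stack[sp++] = v; }
static long pop(void)    { return stack[--sp]; }

static void eval(const char *src)
{
    char buf[256];
    strncpy(buf, src, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';

    for (char *w = strtok(buf, " "); w; w = strtok(NULL, " ")) {
        if      (strcmp(w, "+")   == 0) { long b = pop(); push(pop() + b); }
        else if (strcmp(w, "*")   == 0) { long b = pop(); push(pop() * b); }
        else if (strcmp(w, "dup") == 0) { long a = pop(); push(a); push(a); }
        else if (strcmp(w, ".")   == 0) { printf("%ld\n", pop()); }
        else                            { push(strtol(w, NULL, 10)); }
    }
}

int main(void)
{
    eval("3 4 + dup * .");  /* (3+4)*(3+4) -> prints 49 */
    return 0;
}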
I only know C
>no APL
While I don't use them regularly, I'm pretty comfortable with C and LISP. I learned C through courses, and Scheme from the SICP lectures. The latter really elevated my programming abilities and changed how I did things in other languages. This was even after years of learning and personal + professional experience.
I've always been curious about the other 2 though.
Array oriented programmers intimidate me. They're the real wizards.
Those languages, while concise, seem very write-only. I still wish I was on that level.
>Array oriented programmers intimidate me. They're the real wizards.
>Those languages, while concise, seem very write-only. I still wish I was on that level.
https://beyondloom.com/blog/denial.html
APL is unreadable trash no matter how you spin it. It is responsible for starting the whole "hehe, what does this line do?" cancerous programming culture.
Dijkstra was right to hate it.
obligatory
That seems hard to read but when you consider that the equivalent in another language would be hundreds of lines long it's not as ridiculous.
APL programmers enjoy being able to see the entirety of the code they are working on at once without having to jump around in the source code, and they would claim it makes it easier to analyze the program as a whole for potential improvements/re-writes.
Direct APL descendants are pretty niche today, but APL influenced a surprising number of languages and libraries, including NumPy, MATLAB, Mathematica, and C++.
And a lot of languages have features that were first introduced in APL without even realizing it, things like accumulate/reduce and partial-sum/scan.
Even the standard mathematical notation for floor and ceiling originated from APL.
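To make the reduce/scan point concrete, a small C sketch of what APL writes as +/ (reduce) and +\ (scan); the function names are just for illustration:

#include <stdio.h>

/* +/ in APL: fold the array down to a single sum. */
static int reduce_add(const int *a, int n)
{
    int acc = 0;
    for (int i = 0; i < n; i++)
        acc += a[i];
    return acc;
}

/* +\ in APL: running (prefix) sums written into out[]. */
static void scan_add(const int *a, int n, int *out)
{
    int acc = 0;
    for (int i = 0; i < n; i++) {
        acc += a[i];
        out[i] = acc;
    }
}

int main(void)
{
    int a[] = { 1, 2, 3, 4 };
    int s[4];

    printf("reduce: %d\n", reduce_add(a, 4));                /* 10 */
    scan_add(a, 4, s);
    printf("scan:   %d %d %d %d\n", s[0], s[1], s[2], s[3]); /* 1 3 6 10 */
    return 0;
}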
I wanted to learn Smalltalk to get a good view of what OOP is supposed to be (I think Smalltalk is the language for that).
Should I use GNU Smalltalk?
I couldn't find the original compiler for Smalltalk 80, didn't search hard enough tho.
What does LULZ think?
Wrong.
I know and am comfortable working in "C" and assembler. The rest is just bloat.
FORTH LOVE IF HONK
Is there a language like Smalltalk but with an existing job market?
I always liked Smalltalk and working in something that's similar would be comfy.
No, jobs suck.
smalltalk died for good reasons: it was slow as balls (like twice as slow as however slow you think it was), and outside of GNU smalltalk it was all image based, which has serious fundamental issues outside of niches like relational databases
if you want to learn what OO done well is like just learn ruby, aka smalltalk and perl's fun but autistic child
vw smalltalk was very fast. Modern smalltalks are slow as balls though, that's true. Like python-slow. What you call image-based is not the problem. The problem is that it was tied to its IDE (which was basically an integral part of the language). The advantage is that you can modify the IDE (which is literally a part of your program) to suit your workflow needs. For example, you're working on a game? You can add a model viewer *in the IDE* just using your own code AND ship this modified dev env PLUS code PLUS current program state to someone else for collab. How stupid cool is that? But it's mostly disadvantages (can't easily file out without a special function, unclear application total state, you CAN break the image so that it fails to load, and due to the above you can't easily repair it, etc.). However one doesn't necessarily imply the other since you can use a file source for image-based development, plus appropriate tree-shaking facilities, as lisp allows.
>However one doesn't necessarily imply the other since you can use a file source for image-based development, plus appropriate tree-shaking facilities, as lisp allows.
no shit, my point was that it was ALL image based from planning to prod, not just in your dev environment. lisp (aside from a tiny number of fully image-based implementations) found a nice middle ground which is why it managed to cling to life just long enough for clojure and the emacs ecosystem to revive it
still though i will admit smalltalk is way more fun to futz with as a toy
Clojure isn't Lisp.
What is it then?
Java with more parentheses.
>outside of GNU smalltalk it was all image based
This is the one glaring problem with Smalltalk and I'm amazed that so few people even mention it.
One of the primary ways to manage software complexity, regardless of domain or technologies used, is code reuse. Most of the programs we write incorporate or interface with existing code, so that we don't have to write it again.
The heavily image-oriented nature of Smalltalk makes this extremely impractical and sometimes not even possible. Each program becomes almost completely isolated from the rest.
While Smalltalk is in some ways more elegant, Lisp got this aspect right: it is still somewhat image-based, but it does not suffer from this problem since it provides a standardized mechanism to load existing code at runtime, located somewhere else. While you can deploy an image, you can simply share libraries in source code form alone and let the implementation load them.
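As a loose analogy in C terms (this is not Lisp's mechanism, which is load/require plus system definitions like ASDF), "load existing code at runtime" looks like POSIX dlopen/dlsym. The library path and symbol name below are made up for the sketch, and you may need to link with -ldl:

#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Open a shared library by path at runtime... */
    void *lib = dlopen("./libfoo.so", RTLD_NOW);
    if (!lib) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    /* ...and look up a symbol in it by name. */
    void (*foo_init)(void) = (void (*)(void))dlsym(lib, "foo_init");
    if (foo_init)
        foo_init();   /* call code that lives outside this program's "image" */

    dlclose(lib);
    return 0;
}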
Being image-based wasn't much of a problem when the only image you would ever have was your operating system. The isolation is more a consequence of modern computers and operating systems using separate address spaces for every process and the single world approach being a bad fit for it.
> This is the one glaring problem with Smalltalk and I'm amazed that so few people even mention it.
Because it's a retarded statement
Image-based development allows you to modify live, instantiated objects and enables extremely iterative development. It's actually one of the best things about smalltalk dev: you can query the state of objects and do a lot of things
If you want to reuse code, just load the image
Mezzano is like this. It's all image-based and booting just loads whatever you saved last.
So were the lisp and smalltalk operating systems that ran on their respective symbolic hardware engineered to run their languages natively.
Smalltalk machines notably introduced the GUI. Comb your mind for a bit for what the XEROX engineers meant when they told Steve Jobs that there was more to it than the mouse and the cursor.
Can mezzano even be installed on bare hardware?
Yes.
>no erlang
Clearly written by a cs undergrad in mumbai
Forth is giga chad, but idk if it forms the basis of modern programming. It's ahead of its time.
>It's ahead of its time.
It's just a stack machine, what the fuck is "ahead" of what?
image-based, mixed compile-time and compilation+run-time, macro-less super-macros.
>image-based, mixed compile-time and compilation+run-time
They are approximated in other languages like lisp and smalltalk
> macro-less super-macros.
I don't know about them but at a quick glance it's not extraordinary either
The only "remarkable" thing is the iterative development, but it's not something other languages haven't already tried
The only notable (and probably the sole) use case of forth today is Open Firmware.
The perception of being "ahead of its time" comes from the fact that it's not used, so its limitations haven't been experienced in real-world scenarios, just like with lisp machines and such.
>approximated
No. Some languages like lisp and smalltalk have full image-based programming, it's not an approximation.
Nothing else has the same kind of mixed compilation+runtime system that forth has, which is necessary for its ability to do macro-less macros.
>it's not extraordinary either
lol, lmao. By glance you clearly mean guess.
Forth was used for plenty of huge projects and, just like lisp beyond its use for OSs, it repeatedly showed its benefits with few downsides. Importantly, forth is fully capable of low-level programming (moreover, on a level even C isn't capable of), while retaining features of such a high level that implementing a full smalltalk system on top of forth is downright easy.
Can you link to something you've made with Forth?
It's proprietary so no. Ask my employer to release our code.
>forth, lisp, smalltalk
Simple, well-designed languages, each nice in its own way.
> c
Random ad hoc syntax, retarded build process, everything as unintuitive and complex as it can be. Of course this is the language that gets picked as the one the OS will be written in, and its retarded, completely arbitrary syntax becomes the standard for new languages.
I only know intermediate ActionScript 2.0 and some basic ActionScript 3.0, sorry chuds.
emacs