In this installment I would like to show you, in six cases, why Janet is my programming language of choice.
I will start small: Janet has both mutable and immutable data structures for all three usual suspects. So we have the buffer (mutable) and the string (immutable), the table (mutable) and the struct (immutable), and of course the array (mutable) and the tuple (immutable). I understand the argument for immutable data, and respect it, but I love having the choice. Janet gives me that choice at every step, and yet guides me by its design.
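A quick sketch of the three pairs:

```janet
# Mutable buffer vs immutable string
(def buf @"hello")
(buffer/push-string buf ", world")   # buffers can grow in place
# strings like "hello" cannot be modified

# Mutable table vs immutable struct
(def tbl @{:lang "janet"})
(put tbl :mutable true)              # fine on a table
# (put {:lang "janet"} :mutable true) would raise an error on a struct

# Mutable array vs immutable tuple
(def arr @[1 2 3])
(array/push arr 4)                   # fine on an array
# tuples like [1 2 3] cannot be modified
```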
The fundamental property of the Janet language is the low-level nature of its implementation. Large parts of Janet are written in C: the parser, the compiler, the virtual machine and much of the functionality in the standard library.
This choice not only makes Janet fast and efficient; it also makes the underlying mechanisms inspectable and transparent. Another benefit is the header file for the Janet C API, with which it is not that hard to wrap your own or existing C code as a Janet library, or to embed Janet in another C program.
In the spirit of the best Unix programs, printing to stdout is a first-class citizen in the Janet language. For example, Temple, a templating language by the creator of the language, just prints. So does the base function in the formatting part of the Spork library. The standard library has many functions that make working with printed output easier, e.g. redirecting it.
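One concrete example of that redirection: print writes to whatever stream sits in the :out dynamic binding, so capturing output is a matter of rebinding it. A minimal sketch:

```janet
(def buf @"")

# Everything printed inside the body goes into the buffer
# instead of stdout.
(with-dyns [:out buf]
  (print "captured!"))

(string buf) # the buffer now holds the printed line
```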
Threads are also part of the core of the language. Opinions differ on when you should use threads, but sometimes it is very convenient to spin them up for heavy work, especially when the threads are more similar to Erlang processes than to standard OS threads. Still, you have to use them carefully, with race conditions and locks in mind.
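A minimal sketch, assuming the thread/ module as it existed at the time of writing (later Janet versions moved this functionality into the ev/ module): a worker thread computes a value and mails it back to the parent, Erlang-style.

```janet
# The worker function receives the parent thread as its argument.
(def worker
  (thread/new
    (fn [parent]
      (thread/send parent (* 6 7)))))

# Block until the worker's message arrives in our mailbox.
(def answer (thread/receive))
```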
Now you must think I have gone crazy. OOP in a functional language, isn't that an anti-pattern? I do not think so. I once attended a talk comparing all three main paradigms, and my takeaway was that every paradigm has its place and time.
OOP in Janet is prototype-based and backed by the table data structure. You can easily set a prototype, with some properties or behaviour, on a table, and your table then inherits them.
The syntax for sending messages is easy: call a keyword with the object you are sending the message to as the first argument. This syntax often confuses people coming from Clojure, because they are used to getting a value from a hashmap this way.
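A small sketch with hypothetical names: a prototype table holding behaviour, inherited via table/setproto, and a message sent with the keyword-call syntax.

```janet
(def animal
  @{:speak (fn [self] (string (self :name) " makes a sound"))})

# dog inherits :speak from its prototype
(def dog (table/setproto @{:name "Rex"} animal))

# Keyword in function position: looks up :speak on dog
# (following the prototype chain) and calls it with dog first.
(:speak dog) # -> "Rex makes a sound"
```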
One exciting possibility emerging from this concept is that you can easily teach messaging to your JanetAbstract objects written in C. Then you have constructs written in C that look just like ordinary Janet objects.
As you may remember from the How I became Janet post, I was an avid Rubyist. One of my favourite quotes from Matz is:
There are only three things certain in a programmer's life: taxes, death and parsing text.
For this purpose, Janet uses PEGs (parsing expression grammars). If you are not sure what those are, read the article to understand the concept better. With a PEG you can create small state machines that scan, parse and transform text into another form of data.
What do I love most about this particular implementation? That it is part of the core of the language and, you guessed it, written in C. The syntax of a PEG definition is pure Janet code, very similar to how you write macros, with all the quoting, quasiquoting and unquoting. Its learning curve is steep, especially if you strive for very optimal definitions, but once you get a feel for it, you will never look at a string of text the same way again. And you will finally understand why you hated regexes so much.
One use case where PEGs shine, for my particular interests, is parsing interactive user commands, mostly in TUI applications. I will discuss this more in a later post, where I will write about my projects.
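As an illustration of that use case, here is a small hypothetical grammar that splits a command line into words:

```janet
(def command-grammar
  (peg/compile
    ~{# a word is one or more non-whitespace characters, captured
      :word (capture (some (if-not :s 1)))
      # a command is words separated by whitespace, until end of input
      :main (* :word (any (* :s+ :word)) -1)}))

(peg/match command-grammar "open notes.txt")
# -> @["open" "notes.txt"]
```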
An environment is just a Janet table that holds all loaded bindings. The one called root is used as a base when you want to create a new one, for example for a new fiber you are creating, or when you are compiling your code by hand.
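A sketch of that by-hand path: a fresh environment made with make-env, a form compiled against it, and the resulting binding read back out of the plain table.

```janet
(def env (make-env))           # new environment, based on the root one

# Compile the form against our environment and run it.
(def thunk (compile '(def answer 42) env))
(thunk)

# The binding landed in the environment table as an ordinary entry.
(get-in env ['answer :value])  # -> 42
```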
As is customary in many languages, Janet has a module system to separate concerns and divide code into smaller parts. By default, one source file maps to one module and contains its bindings and environment. By importing a module, you bring these into the current environment. But there are many other mechanisms for making this your own, some of which I will show in this part.
A loader is a mechanism that imports modules into the environment. Interestingly, you can have loaders not just for Janet code, images and native modules, but for whatever syntax you like. One lovely example is the Temple library, which uses them to load your HTML (or any other) templates.
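A sketch of a hypothetical loader, to show the shape of the mechanism: a loader is a function from a path to an environment, registered in the module/loaders table, with a pattern added to module/paths so that import can resolve it (the exact pattern syntax here is an assumption).

```janet
# Hypothetical loader: expose a text file's contents
# as a single `contents` binding.
(defn text-loader [path &]
  (def env (make-env))
  (put env 'contents @{:value (slurp path)})
  env)

# Register it under a new keyword and teach the resolver about .txt files.
(put module/loaders :text text-loader)
(array/push module/paths [":all:.txt" :text])
```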
The concept is very similar to coroutines in other languages, with one caveat: fibers are also one of the basic building blocks of the language and its virtual machine. This is supported by the fact that a fiber contains a mechanism for signalling, which distinguishes, among other things, a fiber's early return, called yielding. One of the usages touted by the language's creator is capturing an error raised in a fiber's code and returning it to the calling fiber. Yes, whenever Janet code is running, you are in a fiber.
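A short sketch of both behaviours: yielding values back to the caller, and trapping an error signal with the :e mask.

```janet
# Yielding: each resume returns the next yielded value.
(def f (fiber/new (fn [] (yield 1) (yield 2) 3)))
(resume f) # -> 1
(resume f) # -> 2
(resume f) # -> 3, the fiber's final return

# Error capture: the :e mask turns a raised error
# into a plain value returned to the calling fiber.
(def g (fiber/new (fn [] (error "boom")) :e))
(resume g)        # -> "boom"
(fiber/status g)  # -> :error
```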
Right now the development version of the language contains generators, built on fibers. They are similar to Python's generators, but because fibers are first-class citizens of the language, they feel more natural to me.
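A sketch, assuming the generate macro from that development version (it has since landed in stable releases):

```janet
# generate builds a fiber that yields one value per iteration
(def squares (generate [x :in [1 2 3]] (* x x)))

# a generator is just a fiber, so it can be consumed
# like any other iterable
(seq [s :in squares] s) # -> @[1 4 9]
```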
This experimental feature was added just in the latest minor update and is still developing, but it has great potential for the future. Maybe you already know what it is: an event loop in the core of the language! Just say how cool that is; with fibers as building blocks, it shines for anything async you throw at it (as long as it does not block, that is). Even though I am very excited about this feature, I would rather wait a little for it to mature and show all its facets.
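A minimal sketch, assuming the ev/ module in the shape it has since stabilized into: a task spawned on the event loop talks to the main fiber over a channel.

```janet
(def ch (ev/chan 1))

# Run a fiber on the event loop; it hands a value to the channel.
(ev/spawn (ev/give ch :done))

# Suspend the current task until the value arrives.
(def result (ev/take ch)) # -> :done
```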
I know what you are thinking: pepe has just run out of ideas. No, I do not want to talk about the fact that Janet compiles source code into virtual machine bytecode. What I want to convey is that there is absolutely nothing wrong with you, as a programmer, grabbing some AST (probably provided by the language's parser), compiling it, and running the compiled code. Or using some of the higher-level mechanisms already present in the standard library, such as dofile or require.
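A sketch of that pipeline done by hand: feed a string to the parser, take the produced AST, compile it, and run the result.

```janet
(def p (parser/new))
(parser/consume p "(+ 1 2 3)")
(def ast (parser/produce p))   # the tuple (+ 1 2 3)

# compile returns a callable thunk on success
(def thunk (compile ast))
(thunk) # -> 6
```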
I used my minimal knowledge of programming language design to develop a Janet linter for the Kakoune editor. It is probably not the best linter around, but it works great for me.
And not only code: even PEGs can be compiled for a faster and more efficient run, when their time comes.
Like any other Lisp, and many other languages, Janet has macros. With all of the above said, I am not afraid to write them, and I even use them in ways that will raise many eyebrows, I fear. Yes, we can use macros! Deal with it.
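A tiny sketch with a hypothetical name, using the same quasiquoting machinery that PEG definitions lean on:

```janet
# A simplified inverse of `if`: run the body only when
# the condition is false.
(defmacro my-unless [condition & body]
  ~(if ,condition nil (do ,;body)))

(my-unless false :ran) # -> :ran
(my-unless true :ran)  # -> nil
```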
As you may understand by now, tables are everywhere in the language: in OOP, in environments and in other places; frankly, they are quite common in other languages too. So? In my opinion, Janet's tables are top-notch, similarly to fibers, because they are a basic building block of the language's design.
The standard Janet repl already comes with a lot of shortcut keys for navigating and modifying the input text. A nice consequence is that all of them are also present in the standard library's getline function. But there are two I use so much that I refuse to switch from the editor to a remote repl:
tab completes the binding you have started typing, even showing you the options when there is more than one.
ctrl+g shows the documentation for the binding under the cursor! You may know the feeling: is the separator the first or the last parameter to this function?
And both are present when you use the standard library's getline; you just provide a table with the bindings.
The standard library has a loop macro similar to those in other Lisps. It also has a sister macro, seq, which gathers the results of all iterations. I do not use it very often, but when I do, I praise it.
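A quick sketch of both:

```janet
# loop is for side effects; it returns nil
(loop [i :range [0 3]]
  (print "line " i))

# seq gathers the result of every iteration into an array
(def squares-of-odds
  (seq [i :range [1 6] :when (odd? i)]
    (* i i)))
# -> @[1 9 25]
```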
That's all folks for this part. The next one will again be a little more philosophical.
To understand all this better, be sure to read the first installment.