Urn: A Lisp implementation for Lua

Urn is a new language developed by SquidDev and demhydraz. It is a Lisp dialect with a focus on minimalism that compiles to Lua.

What?

  • A minimal¹ Lisp implementation, with full support for compile-time code execution and macros.
  • Support for Lua 5.1, 5.2 and 5.3. Should also work with LuaJIT.
  • Lisp-1 scoping rules (functions and data share the same namespace).
  • Influenced by a whole range of Lisp implementations, including Common Lisp and Clojure.
  • Produces standalone, optimised Lua files: no dependencies on a standard library.

¹: Minimalism is an implementation detail.

http://urn-lang.com/

Building Business Systems with Domain-Specific Languages for NGINX & OpenResty

This post is adapted from a presentation at nginx.conf 2016 by Yichun Zhang, Founder and CEO of OpenResty, Inc. This is the first of two parts of the adaptation. In this part, Yichun describes OpenResty’s capabilities and goes over web application use cases built atop OpenResty. In Part 2, Yichun looks at what a domain-specific language is in more detail.

You can view the complete presentation on YouTube.

https://www.nginx.com/blog/building-business-systems-with-domain-specific-languages-for-nginx-openresty-part-1/
https://www.nginx.com/blog/building-business-systems-with-domain-specific-languages-for-nginx-openresty-part-2/

Tamale – a TAble MAtching Lua Extension

Tamale is a Lua library for structural pattern matching – kind of like regular expressions for arbitrary data structures, not just strings. (Or Sinatra for data structures, rather than URLs.)

tamale.matcher reads a rule table and produces a matcher function. The table should list {pattern, result} rules, which are structurally compared in order against the input. The matcher returns the result for the first successful rule, or (nil, "Match failed") if none match.
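
For a feel of the API, here is a minimal sketch modelled on the examples in the project's README; the "move"/"quit" message shapes and variable names are invented for illustration:

  local tamale = require "tamale"
  local V = tamale.var                   -- V"name" creates a pattern variable

  local handle = tamale.matcher {
     -- patterns are compared structurally, in order, against the input
     { { "move", V"X", V"Y" },           -- matches e.g. { "move", 3, 4 }
       function(caps) return { x = caps.X, y = caps.Y } end },
     { { "quit" }, "shutting down" },
  }

  local pos = handle { "move", 3, 4 }    -- pos.x == 3, pos.y == 4
  print(handle { "quit" })               --> shutting down
  print(handle { "resize", 80, 24 })     --> nil   Match failed

Result values that are functions are called with a table of captured variables, which is what turns the { "move", ... } rule into a small message dispatcher here.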

https://github.com/silentbicycle/tamale

Tutorial: Deep Learning in PyTorch

This Blogpost Will Cover:

  • Part 1: PyTorch Installation
  • Part 2: Matrices and Linear Algebra in PyTorch
  • Part 3: Building a Feedforward Network (starting with a familiar one)
  • Part 4: The State of PyTorch

Pre-Requisite Knowledge:

  • Simple Feedforward Neural Networks (Tutorial)
  • Basic Gradient Descent (Tutorial)

Torch is one of the most popular Deep Learning frameworks in the world, having dominated much of the research community for the past few years (only recently being rivaled by the major Google-sponsored frameworks TensorFlow and Keras). Perhaps its only drawback for new users has been that it requires knowing Lua, a language that has been uncommon in the Machine Learning community. Even today, this barrier to entry can seem like a lot for newcomers to the field, who are already in the midst of learning a tremendous amount without also taking on a completely new programming language.

However, thanks to the wonderful and brilliant Hugh Perkins, Torch recently got a new face, PyTorch… and it’s much more accessible to the Python hacker turned Deep Learning extraordinaire than its Luariffic cousin. I have a passion for tools that make Deep Learning accessible, so I’d like to lay out a short “Unofficial Startup Guide” for those of you interested in taking it for a spin.

https://iamtrask.github.io/2017/01/15/pytorch-tutorial/