
TL;DR: do you know of any general-purpose languages that can also compile a function down to some representation of AND/OR gates (or NAND gates, or whatever)?

Edit: actually, any algebraic/formal-logical system is fine (not just Boolean algebra).

Yes, A LOT of additional info is needed, like how input/output is specified, and I am interested in how that would be done. I'm not interested in printing an actual circuit, just the Boolean-logic level. And I'm mostly asking because I feel like most compilers can't generate a clean/mathematical representation from their AST. There's AST to IR, there are hard-coded optimizations on the IR, and then there are hard-coded mappings from the IR to assembly, but at no point (AFAIK) is the code turned into an algebraic/logical system where something like De Morgan's law can be applied. And that seems really sad to me.

So you could say my real question is: what compilers have a strong logical/algebraic internal representation of their own AST?
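To make the kind of algebraic representation I mean concrete, here's a toy sketch (Haskell chosen purely as illustration; the `Expr` type and `pushNot` pass are made up for this post, not taken from any real compiler) of a Boolean-expression IR where De Morgan's laws are literally just rewrite rules:

```haskell
-- A minimal Boolean-expression IR, plus a rewrite pass that pushes
-- negations inward using De Morgan's laws and double-negation elimination.
data Expr
  = Var String
  | Not Expr
  | And Expr Expr
  | Or  Expr Expr
  deriving (Show, Eq)

pushNot :: Expr -> Expr
pushNot (Not (And a b)) = Or  (pushNot (Not a)) (pushNot (Not b))  -- not (a and b) = not a or  not b
pushNot (Not (Or  a b)) = And (pushNot (Not a)) (pushNot (Not b))  -- not (a or  b) = not a and not b
pushNot (Not (Not a))   = pushNot a                                -- not (not a)   = a
pushNot (Not a)         = Not (pushNot a)
pushNot (And a b)       = And (pushNot a) (pushNot b)
pushNot (Or  a b)       = Or  (pushNot a) (pushNot b)
pushNot v@(Var _)       = v

main :: IO ()
main = print (pushNot (Not (And (Var "x") (Not (Var "y")))))
-- Or (Not (Var "x")) (Var "y")
```

A real compiler IR obviously also needs sharing, bit widths, state, and so on, but this is the level at which a law like not (a and b) = not a or not b becomes a one-line rewrite.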

Maybe something like Haskell or Prolog does this. The Wolfram Language almost certainly does, but it's closed source.
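For what it's worth, GHC does expose user-written algebraic rewrite rules over its Core IR via `RULES` pragmas; whether that counts as a "strong" algebraic representation is debatable, but the canonical example from the GHC user's guide (normally placed in the module that defines the functions involved) looks like this:

```haskell
{-# RULES
"map/map"  forall f g xs.  map f (map g xs) = map (f . g) xs
  #-}
```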

[–] kakes@sh.itjust.works 3 points 1 year ago (1 children)

Another problem I see is that for some machine instructions, the Boolean output would be absolutely massive (not to mention redundant). Take something like a fetch from RAM, for example, which happens at least once for every machine instruction. The binary description would be huge, especially if you include every "false" check.
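A rough back-of-the-envelope sketch of that blowup (the gate counts are my own crude assumptions, not a real datapath): reading a single data bit out of a RAM with n address lines, written naively as a sum-of-products mux, needs one product term per address:

```haskell
-- Rough gate count for reading ONE output bit of a RAM with n address lines,
-- written naively as a sum-of-products mux:
--   out = OR over all 2^n addresses of (address-match AND stored bit)
ramReadGates :: Integer -> Integer
ramReadGates n = terms * andsPerTerm + orGates
  where
    terms       = 2 ^ n      -- one product term per address
    andsPerTerm = n          -- roughly n ANDs to match an n-bit address
    orGates     = 2 ^ n - 1  -- OR tree combining all the terms

main :: IO ()
main = mapM_ report [8, 16, 32]
  where
    report n = putStrLn (show n ++ " address bits -> ~" ++ show (ramReadGates n) ++ " gates per data bit")
```

At 16 address bits that's already around a million gates for one output bit, before you multiply by the data width.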

If you haven't seen it, I highly recommend Ben Eater's Breadboard Computer series. His videos take you from logic gates up to a simple functioning computer.

If you could create/find a C compiler for this simple-as-possible computer, then in theory you could use these videos to get the full Boolean logic of each instruction. Not something I would do, but I think that would be the easiest way to do it.

[–] jeffhykin@lemm.ee 2 points 1 year ago

This is a really good point. If a complicated pure function is straightforwardly converted into a Boolean expression, at some point the best way to simplify it would be to build a Turing machine INSIDE the expression itself.

I was mostly thinking of small-scale functions or sections of really hot/real-time code. Maybe using it to analyze potential new/helpful instructions for an assembly language, or as a foundation for highly advanced bit-level optimizations like the fast inverse square root hack from Quake (but automated and generic).
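For anyone who hasn't seen it, the Quake trick being referenced looks roughly like this when transcribed into Haskell (the original is C with pointer type-punning; this is just to show the flavor of bit-level rewriting I mean, not part of any proposed tool):

```haskell
import Data.Bits (shiftR)
import GHC.Float (castFloatToWord32, castWord32ToFloat)

-- The Quake III "fast inverse square root" bit hack: reinterpret the float's
-- bits as an integer, shift and subtract from a magic constant to get a cheap
-- first guess at 1/sqrt(x), then refine with one Newton-Raphson step.
fastInvSqrt :: Float -> Float
fastInvSqrt x = y * (1.5 - 0.5 * x * y * y)  -- one Newton-Raphson iteration
  where
    i = castFloatToWord32 x
    y = castWord32ToFloat (0x5f3759df - (i `shiftR` 1))

main :: IO ()
main = print (fastInvSqrt 4.0)  -- roughly 0.5
```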

I'll check out that link! In my undergrad, one of the classes had us build our own machine language starting from logic gates, muxes, registers, memory, adders, ALUs, etc., all the way up to our own custom assembly language. It was probably the most helpful class of my entire undergraduate degree.