I JUST WANNA HAVE INTERFACES IN MY PROJECT, I GOTTA USE PYTHON SINCE I'M SCRIPTING FOR ITERM2, WHAT DO YOU MEAN I HAVE TO IMPORT THE ABILITY TO INTERFACE :kitty-cri-screm:
That's what you get for using OOP, the most bourgeoisie of all programming paradigms. I refuse to elaborate further.
There's a reason they call it a "functional" language, because all the other ones are just form :frothingfash:
Someone once said that Python isn't an object-oriented language, it's just a language with some object-oriented features.
I love the power of the language, but it has completely fucked up the mind of everyone who learned on it, as they go on to try to make arrays with multiple data types in other languages and act confused when it doesn't work.
Yeah, there’s a reason people talk about “pythonic” code. Because it’s all designed around duck typing and hash tables and dunder protocols, so if you try and just port your OOP patterns over to it 1-to-1, you’re gonna have a bad time. But honestly list comprehensions are so cool so 🤷
Don't get me wrong, it's fucking beautiful at what it does and is a great programming language. I just don't think it should be taught until Junior year at the earliest.
Maybe I'm a hardass because I grew up on C and C++, but I still think programmers should learn how programs work at the assembly level and up.
Yeah I think our software dev education in general is really out of order. I think we avoid teaching the mental model of how computers actually work a lot of the time because it’s so much more complicated than in the days of the Commodore 64. Microcode, pipelines, virtual memory, cache misses, memory locality, etc have made our CPUs so much faster, but have also added a bunch of pitfalls to explaining exactly how things work. I’m somewhat hopeful that some of the newer trendy languages like Go or Zig might make for good first-time languages. Go in particular, you can get up to speed so quickly.
Go in particular, you can get up to speed so quickly.
Oh my goodness yes. Amazing language. I grew up on C (and to a lesser extent C++). Go was so wonderfully easy to learn. I picked up the basics in literally an evening, and was making useful utilities in a weekend. The memory safety, static typing, easy and true concurrency, massive standard library, the fact that it is a compiled language that outputs a static binary and doesn't depend on the destination machine having the exact same interpreter version and imported packages as your dev machine: I love it. It is a headache-preventing language. I still write a little C++ but just for microcontroller projects, things using Arduinos or ESP32s and the like.
I've got issues with using dynamically-typed languages for anything but the most basic scripting tasks. You just don't know when the whole thing is going to fail catastrophically on weird inputs.
I think we avoid teaching the mental model of how computers actually work a lot of the time because it's so much more complicated […] Microcode, pipelines, virtual memory, cache misses, memory locality, etc
Tell me more/where can I learn more
The old model
Short version
You have a CPU and a bunch of memory. The CPU has a program counter. It points to the memory address of the current instruction. The instruction gets loaded in. The CPU follows the instruction. And then depending on what the instruction was and how it went, it would either add one to the program counter or set it to a specific address (that’s called a jump).
Long version
You have:
- a clock—sends a pulse at some regular interval
- a bunch of memory—usually RAM, can retain its values as long as the computer has power, each byte usually has its own address
- a program counter—holds the memory address of the current instruction
- buses—circuits to transfer bytes between different modules
- registers—“working” memory
- Arithmetic/Logic Unit—Part of the CPU, cool math stuff
- Control Unit—Part of the CPU, decides what to do next
You start up the machine, and the program counter defaults to some preset value (usually 0x00). The instruction at the program counter is loaded into the CU. The CU decides what needs done. It could load something into a bus from memory, push some value into a register, instruct the ALU to add two registers together, etc. These are all system-specific possibilities and can vary pretty greatly.
The CU may take a different number of cycles depending on the instruction, but at the end of each instruction, it will either increment the program counter by 1 or it will jump to a specific address based on the outcome of the instruction. Some instructions will always result in a jump. Some never will. Some are conditional. That all depends on the machine you’re writing for.
This all repeats over and over until the instruction that gets loaded in is a halt instruction. And that’s it.
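If it helps to see it run, here's a toy sketch of that loop in Python. The "instruction set" is completely made up for illustration; real ones are system-specific, like I said.

```python
# A toy fetch-decode-execute loop. The instructions here are invented
# purely for illustration; real ISAs vary a lot.
memory = {
    0: ("LOAD", "A", 5),   # put 5 in register A
    1: ("LOAD", "B", 1),   # put 1 in register B
    2: ("SUB", "A", "B"),  # A = A - B
    3: ("JNZ", "A", 2),    # if A != 0, jump back to address 2
    4: ("HALT",),
}
registers = {"A": 0, "B": 0}
pc = 0  # program counter

while True:
    op, *args = memory[pc]  # fetch the instruction at the PC
    if op == "HALT":
        break
    elif op == "LOAD":
        registers[args[0]] = args[1]
        pc += 1
    elif op == "SUB":
        registers[args[0]] -= registers[args[1]]
        pc += 1
    elif op == "JNZ":  # conditional jump
        pc = args[1] if registers[args[0]] != 0 else pc + 1

print(registers)  # {'A': 0, 'B': 1}
```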
Cool videos to learn more
- Ben Eater’s 23-part series where he designs and builds a custom computer from scratch on breadboards.
- LostT’s DEF CON 24 talk titled “How to build a processor in 10 minutes or less”
- Matt Parker’s Numberphile video where he builds computer circuits from dominoes to show the fundamental ideas there.
- Matt Parker domino computer, but bigger
Stack and heap
The stack and the heap aren't part of the modern complications that I mentioned; they're way too old to count. However, if you go back to those old-school personal computers, having a dedicated stack wasn't always a given.
This StackOverflow post is a decent explanation of what they are and how they work.
Modern complications
It used to be that each instruction took a preset number of clock cycles. But that would leave some modules doing absolutely nothing for extended periods of time. So instead, if a CPU could prove that, for example, a certain section of ALU code would run the same way no matter what the previous code did, it would run it early and let the ALU work on it while the previous code finished. And this worked even better if code wasn't necessarily executed in the exact specified order, but instead in an equivalent order. This is the basis for pipelining and out-of-order execution.
Now, as people who maintained instruction sets grew their instruction sets over time, they found that it was often easier to, instead of create new circuitry for every single new instruction, just translate new instructions into a series of simpler old instructions. This would be slow to do with every instruction, but modern computers load a bunch of instructions at once and then cache their results. This whole situation is called microcode. Your instructions can technically get translated into an arbitrary set of largely undocumented instructions. Usually it’s nbd and makes things faster. Sometimes, it involves accusations of the NSA adding back doors into your encryption instructions.
Memory virtualization, I think, is an OS thing that I don't super understand, but it basically means that an address is no longer just an index into the giant array of memory that is your RAM; the OS and hardware translate each process's addresses to physical ones behind the scenes.
Memory locality and cache misses are things that affect speed. Basically, your CPU loads memory in chunks, so if you store related memory close together, it will cut down on reads because those addresses will already be in cache. This seems like a small detail, but it's hard to overstate how much accounting for it can speed up your code.
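You can actually measure this from Python. A rough sketch (assumes numpy is installed; exact numbers vary by machine, but the column walk is usually noticeably slower):

```python
# Rough demo of memory locality: rows of a C-order array are contiguous
# in memory, columns are strided, so a column walk misses cache far more.
import timeit
import numpy as np

a = np.zeros((3000, 3000))  # C-order by default: each row is contiguous

row_walk = timeit.timeit(lambda: [a[i, :].sum() for i in range(3000)], number=3)
col_walk = timeit.timeit(lambda: [a[:, j].sum() for j in range(3000)], number=3)

print(f"row walk: {row_walk:.2f}s, column walk: {col_walk:.2f}s")
```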
Pick up a used copy or free pdf of Patterson and Hennessy’s book and/or go through Nand to Tetris
I think programmers ought to learn to use oscilloscopes, roll their own capacitors, and write their own transistors with an electron microscope, of course after mining the materials for those themselves.
Someone once said that Python isn't an object-oriented language, it's just a language with some object-oriented features.
The OO purists use this to describe basically every language.
Good thing I'm a functional programmer who uses Clojure. Sucks to suck I guess. 😎
```python
import random

[random.choice([str(i), int(i), float(i)]) for i in range(10)]
```
:sicko-hexbear:
Python is bad. It's not object oriented, but still has classes for some reason? Zero type checking until it explodes at runtime. Slow as hell. But it can do two things really well: handling REST API calls and manipulating datasets. For anything else, use another language.
It doesn't implement a lot of object-oriented design features, namely those related to encapsulation and polymorphism.
You want abstract base classes. That’s how you do what you want to do in Python.
import abc
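A minimal sketch of the pattern (the `Poppable` name is just made up for the example):

```python
from abc import ABC, abstractmethod

# Hypothetical "interface": anything Poppable must provide pop().
class Poppable(ABC):
    @abstractmethod
    def pop(self): ...

class Stack(Poppable):
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

class Broken(Poppable):
    pass  # never implements pop()

Stack()       # fine: pop() is implemented
# Poppable()  # TypeError: can't instantiate abstract class
# Broken()    # TypeError too: pop() was never implemented
```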
To do anything in python you'll probably need to import 1 trillion libraries, it's just what it is.
Is that normal or easier than the python thing?
Also, I wish people would chat to me about stuff I actually wanted to talk about, but they only respond to contentious stuff on topics I know little about. :(
Writing your own vtables and function pointers in general can get quite messy in C if you're not careful. Very much not type safe, and not usually a go-to solution
As for python i'm not sure, since it's not a feature i really use much; python has duck typing, so you can get away with not using interfaces
Also i feel you :deeper-sadness:
Idk, stuff in other posts. I had a wargame comment and a family history comment. Feeling a bit mopey, I guess
Hey I'm the person you responded to with the Wargame comment.
It's not really something I'm familiar with, so I didn't comment, and since it went up pretty late, after most people had already seen the thread (which you said would probably happen), not many people saw it. I was hoping others would engage, but the time-based visibility constraints of this platform (unfortunate) did hurt its chances of being seen.
I bet if you posted that comment as a new post on c/games that people who could engage with it would because I'm sure there are more people here with interest in that kind of stuff.
In short: you rock, sorry I'm not the right person to respond to that type of thing, and thank you for taking the time to type it out. I hope you do make an independent post on it, because I do think you'll find people who are interested in it.
I love how even if you go out of your way to type hint variables, the runtime won't even tell you if a value doesn't match your type hints :blob-no-thoughts:
they're like 'type hints? oh, fancy comments' :blob-no-thoughts:
yeah, python is dynamically typed, the hints are there for linters and the like
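for example (totally legal at runtime; only mypy or a similar checker would flag it):

```python
def double(x: int) -> int:
    return x * 2

# CPython never checks the annotations: this runs happily and
# prints "oopsoops". Only a linter/type checker would complain.
print(double("oops"))
```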
I LOVE DYNAMIC TYPING :hypersus: I LOVE DEBUGGING RUNTIME ERRORS :torment:
you have two choices:
- the compiler goes "good job dipshit, you fucked it" for 2 hours until it allows you to run code
- the interpreter goes "looks good to me" and then immediately explodes for 2 hours until you figure it out
9/10 times i vastly prefer the former because when my computer yells at me for being a stupid fuck it's because i'm being a stupid fuck and that's valid to yell at me about :blob-no-thoughts:
and the 1/10 times i'm just scripting and don't really care if I'm fucking shit up because i'm only telling it to modify, like, my clipboard or something. but, like, that's what bash is for
Nah if it takes you 2 hours just to get your code to compile, I promise you it'll take at least 4 to get it running correctly in an interpreted language. And you better hope that your test cases cover all paths and you don't have any runtime errors lurking that you just didn't notice.
with "for 2 hours" i meant that you continually recompile until it works, not that it takes you 2 hours to compile it
I use dataclasses and NamedTuples a lot because they at least pin down the shape of your data (though the type hints still aren't enforced at runtime; you'd need something like pydantic for actual runtime validation)
I know "just add another tool to your toolchain" isn't a great solution but mypy slaps and has IDE/editor plugins
what's an interface in programming :blob-no-thoughts: (im 3 years into studying compsci lmao)
They’re kind of like a contract that says, “I have these methods and/or properties and that’s all you know about me”.
Imagine you have a bunch of places in your code where you have a list of things that might be empty and you only want to get the first item out and throw an error if it’s empty. You could write a helper function for that:
```typescript
class NotFoundError extends Error {}  // the error type assumed by the helper

function pop_or_throw<T>(items: T[]): T {
  const item = items.pop()
  if (item == null) throw new NotFoundError()
  return item
}
```
But now you realize you actually do the same exact thing with a bunch of maps. You pop an item off and if it’s empty, you throw an error. So to make this helper function usable everywhere, you can write an interface:
```typescript
interface Poppable<T> {
  pop: () => T | undefined
}

function pop_or_throw<T>(items: Poppable<T>): T {
  const item = items.pop()
  if (item == null) throw new NotFoundError()
  return item
}
```
That’s the gist of what interfaces are for. They allow you to think about exactly what your code requires of the things it references and not worry about the specifics of everything else, which often makes it more flexible.
So if you're writing a function that accepts an object, instead of specifying that exact object, you can write an interface that only specifies the methods or properties you plan to use. That way, if you make changes to the object or need to switch to a different object, you can leave your other code intact.
They don’t always make sense for small projects, but for large projects they help maintainability a lot. Also, they help when you’re first reasoning through a problem because they force you to minimize what parts of the rest of your system you have to think about while working on a particular part. I’ve seen it talked about like, “when you’re designing a lego, you don’t think about all possible other kinds of legos, you just imagine the connector circles on top”. Those connector circles are the parts that are shared among all of the pieces you care about. It’s your interface.
it's an abstract class without any defined functionality, which allows a single class to implement multiple of them. Basically, what it's saying is: "hey, everything that uses me has to implement these functions taking in these inputs, but each of them might do something different with the input, and you can still mass-invoke that named function if you need, say, to loop over everything relevant that implements me and do something with each one. I'm also a way of self-documenting responsibility cleanly at the top of a class, because I'm a contract that is designed to simply yell at you if you don't implement your declarations"
lets say you have a button. if you are in front of alice, pressing this button does x. if you are in front of bob, pressing the button does y. If alice and bob both implement the IButtonPressObserver interface with the declared abstract method OnPress(), you can write one function that takes in an IButtonPressObserver in front of you and tells it to run their respective OnPress method instead of one function that takes in alice and one function that takes in bob
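In Python you could sketch that with abc (names taken from the comment above; `OnPress` kept as-is even though Python style would be snake_case):

```python
from abc import ABC, abstractmethod

class IButtonPressObserver(ABC):
    @abstractmethod
    def OnPress(self): ...

class Alice(IButtonPressObserver):
    def OnPress(self):
        print("does x")

class Bob(IButtonPressObserver):
    def OnPress(self):
        print("does y")

def press_button(observer: IButtonPressObserver):
    # one function covers anyone who implements the interface
    observer.OnPress()

press_button(Alice())  # does x
press_button(Bob())    # does y
```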
Python is a weird language. At its core it's the absolute bomb for quick scripting and lazy coding to try things out, and I absolutely love it for that. Pretty sure that's what it was invented for.
On the other hand it has a bunch of half-assed or outright shitty features (I'm looking at you, multiprocessing and GIL, you absolute bane of my fucking existence)
I blame the people who took python and then tried to make it do things it wasn't supposed to cos they were too lazy to learn a different language. Same goes for JavaScript people making nodejs, and now I have to contend with frontend devs who suddenly think they're backend devs implementing some of the most dogshit backend services I have ever seen.
imagine if people did this with shell scripting.
"hmm yes lets use bash for core application architecture and guis and websockets and async and oop" -- ramblings of the unhinged
That's every systems engineer I know. I know one who programs in powershell, lord help us all.
I, too, was guilty of that. Got a bit better, but bash looks so familiar after 40+ years of using it.
You forgot about all the visual basic using ones out there as well lol
You can make powershell look very shell-y (`sh`, `bash`, etc) with its built-in aliases
I blame the people who took python and then tried to make it do things it wasn't supposed to cos they were too lazy to learn a different language.
I suddenly feel the urge to post a link to Home Assistant for some reason.
I mostly kid. I actually quite like HA. It's a great way to de-cloud home tech. But it has some quirks due to the developers choosing Python.
Edit:
Same goes for JavaScript people making nodejs
Oh god yes. Any attempt to point out Javascript's many, many, many performance flaws just results in the Nodejs crowd saying "but the V8 runtime fixed that!" as if it's a point in Javascript's favour. When it takes hundreds of some of the most talented programmers many years to make the language not-quite-too-much-slower than regular compiled languages, that's not exactly making the language sound appealing.
I JUST WANNA HAVE INTERFACES IN MY PROJECT
use duck typing, the python way
(yes i know it's horribly type-unsafe, that's just how idiomatic python is)
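e.g. a tiny sketch (made-up classes, but this is the whole idea):

```python
# no interface declared anywhere; anything with quack() "implements" it
class Duck:
    def quack(self):
        print("quack")

class Impostor:
    def quack(self):
        print("uh, quack, i guess")

for thing in (Duck(), Impostor()):
    thing.quack()

# and anything without it only fails when you actually call it:
# "hello".quack()  # AttributeError, at runtime
```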
Python does have interfaces, it just doesn’t call them that.
In Python any class can inherit from as many other classes as you want it to and you can make a superclass abstract using abstract base classes.
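Quick sketch of that (made-up class names, but abc is in the standard library):

```python
from abc import ABC, abstractmethod

class Readable(ABC):
    @abstractmethod
    def read(self): ...

class Writable(ABC):
    @abstractmethod
    def write(self, data): ...

# one class implementing two "interfaces" at once via multiple inheritance
class Buffer(Readable, Writable):
    def __init__(self):
        self._chunks = []

    def read(self):
        return "".join(self._chunks)

    def write(self, data):
        self._chunks.append(data)

b = Buffer()
b.write("hello ")
b.write("world")
print(b.read())  # hello world
```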