Grew up with this guy. We both took community college programming courses before we were out of high school. I'm impressed with the library he's built up, and CCs tend to be more pragmatic than UCs in California.
This isn't a plug/whatever - just good content from an old friend.
https://www.youtube.com/@ProfessorHankStalica
> Reading code, mostly. Tracing data through five layers of someone else's design choices. Forming a hypothesis about why the bug is happening, then testing the hypothesis, then narrowing. Recognising that the function in front of you is too big and asking what part of it has its own reason to exist. Recognising that the schema in front of you encodes a decision someone made in 2019 and that the decision is now load-bearing for things they did not anticipate. Knowing which of the five tempting cleanups in the file is going to bite you in production and which is safe.
It always struck me as strange that universities never had a course that teaches from open source code. As in: grab the repo of a popular open source project, read part of it, and do your best to create a contribution to it.
The lectures should be about different open source projects and their design choices.
Sorry to paste my old comment here, but the intention is the same.
Before learning programming, one should know what computing is in general. That builds a good mental model; after that, you can easily pick things up and start writing programs yourself.
Data, data, data :))) Some basic notions to know:
Input → Computation → Output
Information is omnipresent (this is just an intuition, not a claim). It serves as both input and output.
Computation—also known as a procedure, function, set of instructions, transformation, method, algorithm, or calculation.
In my early days, I ignored the fundamental notion of data and procedures. But eventually, it clicked: Programs = Data + Instructions
Watch Feynman on computing—he even starts with the same concept of data, introducing computers as information processing systems. And processing requires algorithms (i.e., instructions or procedures).
Programming is simply writing instructions for the computer to perform computations.
A computer is just a machine for computing.
Computation is a general idea: a transformation of one form of information into another.
Richard Feynman's computer science lecture (he starts with the same notion): https://www.youtube.com/watch?v=EKWGGDXe5MA
George Hotz, what is programming? https://www.youtube.com/watch?v=N2bXEUSAiTI
An old documentary on programming: https://www.youtube.com/watch?v=dFZecokdHLo
The Great Principles of Computing overview: https://denninginstitute.com/pjd/GP/gp_overview.html
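The input → computation → output picture above can be made concrete in a few lines. A minimal sketch (the function and values here are made up for illustration):

```python
# Input -> Computation -> Output, in its smallest form.

def to_fahrenheit(celsius):      # the "instructions" (a procedure/transformation)
    return celsius * 9 / 5 + 32  # transforms one form of information into another

reading = 20.0                   # input data
result = to_fahrenheit(reading)  # computation: instructions applied to data
print(result)                    # output data -> 68.0
```

The same shape holds no matter how large the program gets: data goes in, instructions transform it, data comes out.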
It started with all courses teaching algorithms, but on the job you got your algorithms by searching the web or using a library.
Now they teach languages, but you just ask agents, check the accuracy of the code, and rarely read it yourself.
Only a few devs ever wrote new algorithms, and only a few devs will now write genuinely new code. Those few don't need courses, but all the other devs need to pretend they are part of the "few", so they take all the courses, just in case...
To be fair, to learn to think, you have to learn the language first.
Learning to program without knowing the language is useless and counter-productive.
Of course, this doesn't mean you have to learn 10+ languages first... but you have to learn a real programming language (not a toy one) before you can learn to program.
Edit: * a language
> To be fair, to learn to think, you have to learn the language first.
Which language is the language? A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately (with some exceptions that push outside the normal algorithmic language notation of the Fortran, C, Java, JS, Common Lisp, Rust, Go, etc. family of languages; but those are minority languages and a competent programmer shouldn't need more than a short period of time to become literate, if not expressive, in it).
> A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately
That's because the programmer already learned how to program.
But when they started, they definitely didn't write only pseudocode that wasn't runnable (to see the results) for months/years.
> they definitely didn't write only pseudocode that wasn't runnable (to see the results) for months/years.
GT started students that way and it worked well for years. A full semester (number varied, but was the CS 101 course, 1301/1311/1501 or something like that), taught with only pseudocode. They got rid of it because of appearances, trying to be like every other school out there. Eventually settling on Python, I think, after a brief stint with Scheme (which ended after a major cheating scandal).
This. We learn a natural language like English first, before we can use it to express ideas and argue for or against those ideas with evidence. The problem is not teaching a programming language; the problem is stopping there and not teaching how to use it to solve real problems.
In The Art Of Computer Programming, one of the most influential and comprehensive series of books on the subject, Knuth uses a fictional assembly language called MIX in the examples. The reader does "just run the program in their head."
In Software Tools Brian Kernighan and P.J. Plauger describe a pseudo-language called RATFOR (Rational Fortran), and then throughout the book implement RATFOR in itself.
Getting feedback while learning to program has a lot of value, but so does learning to think through code in your head. People old enough to remember when you had to wait a day to run your program and get results back (very slow turnaround) know the value of that skill, we used to call it "desk checking" -- reading through your code and running it in your head and on paper.
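Desk checking is easiest to see with a small loop. A sketch of the kind of paper trace described above (the example function is my own, not from the thread):

```python
# Desk checking: run the code on paper before running it on a machine.

def gcd(a, b):
    """Euclid's algorithm for the greatest common divisor."""
    while b != 0:
        a, b = b, a % b
    return a

# Hand trace for gcd(12, 8), one row per loop iteration:
#   a=12, b=8  ->  a=8, b=12 % 8 = 4
#   a=8,  b=4  ->  a=4, b=8 % 4 = 0
#   b == 0, so return a = 4
print(gcd(12, 8))  # 4
```

Writing out a trace table like the comments above is exactly the "running it in your head and on paper" skill: it forces you to know what every variable holds at every step.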
When I took an introductory programming class at Sacramento City College in fall 2004 during my senior year of high school, we spent the first half of the semester designing our programs using flowcharts and pseudocode. We were encouraged to check the logic of our flowcharts and pseudocode. In the second half of the semester, we implemented those programs in C++.
I haven’t seen this pedagogical practice in any other introductory course since. I believe it’s a holdover from the early days of computing, when programmers didn’t have access to personal computers or even interactive computing, and therefore needed to spend more up-front time on design. Think of the punchcard era, for example.
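The design-first workflow described above translates almost mechanically once the pseudocode has been checked by hand. A hypothetical example (the course described used C++; Python just keeps the sketch short):

```python
# Pseudocode, checked on paper first:
#   total <- 0
#   for each score in the list: total <- total + score
#   average <- total / count of scores
#   print average

# Direct translation of the pseudocode:
scores = [88, 92, 79]
total = 0
for score in scores:
    total += score
average = total / len(scores)
print(round(average, 2))  # 86.33
```

When the logic is already verified, the implementation step is mostly transcription, which is the point of doing design up front.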
I teach introductory programming in C++ at Ohlone College in Fremont, and I have my students write C++ on Day 1, starting with “Hello World” and going from there without flowcharts.
Yeah, but most* companies hire for whatever language X they happen to use, and don't care whether you know how to program, or that you could pick X up in a couple of weeks. (*Anecdotally; I'm sure there are exceptions.)
It's funny, I learned a pile of languages in my undergrad and some UML nonsense. None of it covered properly how to write code that was meant to be read, which IMHO is one of the most basic things.
Just "hey nobody can understand why that line is the way it is, what should we do about that" is probably one of the basic building-block skills of developing on a team, and you teach it wholly by abusing prima donna cowboys until they write something legible or quit.
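The "nobody can understand why that line is the way it is" problem usually comes down to context that lived only in the author's head. A hypothetical before/after (names and numbers are invented for illustration):

```python
# Before: nobody can tell why this line is the way it is.
timeout = 93

# After: the reasoning is in the names and the comment, so the next
# reader doesn't have to reverse-engineer it.
UPSTREAM_TIMEOUT_SECONDS = 90  # what the (hypothetical) load balancer allows
SAFETY_MARGIN_SECONDS = 3      # headroom so we give up before the proxy does
timeout = UPSTREAM_TIMEOUT_SECONDS + SAFETY_MARGIN_SECONDS
```

The value is identical; the difference is that the second version answers the "why" question before it gets asked in code review.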
You need to learn to leetcode in pseudocode first.
I never see anyone learning to program using pseudocode (which isn't runnable, so there's no feedback).
If they used pseudocode, did they just run the programs in their heads?
You can get all these fundamentals for free and probably better from an LLM.