
Showing posts with the label programming

Trust

When UNIX co-progenitor and super-smarty-pants Ken Thompson was given a Turing Award, he provided a warning to those within earshot. Admins and developers often consider it sufficient to review the source code of applications to determine whether they are malicious. And to a certain extent, this works out all right. Over time we have built up a set of expectations about where to expect naughty code, based on our experience. We have also chosen to trust the other tools we use during this process. We discriminate. But there's no reason that bad stuff *has* to be in the applications we expect to find it in. Yes, the clever among us know that compilers can be bad. But we check the source of our compilers and find no bad stuff, and so we assume we are safe. We do, though, compile the compiler, don't we? Well, alright then: some megalomaniac at Intel or somewhere far upstream decided to embed badness in the embedded distro compilation software. We can still look at the binary of com…

Programming in C - Chapter II - It Really IS Rocket Science

Problems arise with numerical expression in computing. In reality, there are infinitely many real numbers. However, there is clearly not an infinite amount of memory even in the largest of supercomputers, and the memory addressable by an application is only a fraction of the total finite memory available. How do we deal with these obstacles? We will explain more in a moment. First let's overview in more detail how the C compiler handles numeric types. Consider the application below:

#include <stdio.h>

int main(void)
{
    float f = 1 / 10;
    printf("%.2f\n", f);
    return 0;
}

Here we declare a float, 1/10, which should clearly resolve to 0.1 (or 0.10, since I am asking printf to print the float with two digits after the decimal point). However, upon compilation and execution the program will stubbornly print a value of "0.00". Why? The issue is that I am declaring a float as an operation of two integers - 1 and …
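(A minimal sketch of the fix this excerpt is driving at, assuming nothing beyond standard C: make one operand a floating-point constant, so the division itself is performed in floating point rather than truncated as integer division.)

#include <stdio.h>

int main(void)
{
    /* 1.0f forces floating-point division; with 1 / 10 the
       integer division truncates to 0 before the assignment. */
    float f = 1.0f / 10;
    printf("%.2f\n", f);   /* prints 0.10 */
    return 0;
}

Asking printf for more digits (say "%.20f") would also reveal that 0.1 has no exact binary representation - which is the finite-memory obstacle raised above.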

Programming in C - Before We Get Started

Requirements | Framework | POSIX

Recently I have been spending quite a bit of time learning how to program in C. It has been quite a few years since I have had anything to do with C, having spent most of my time in a very different OSI layer entirely. Even when I did come across it some time ago, I was never anything but barely competent - this ignorance on my part has always disturbed me, and so I have endeavored to do something about it. Currently I am taking a few computer science classes at Harvard University; it is my plan to summarize here, for interested readers, a portion of what I have taken from those classes concerning the C programming language. For readers with an advanced knowledge of C, these posts can be viewed as refresher courses. The content, although filtered through my humble brain, is entirely the result of knowledge acquired through Harvard, so my hope is that even the experienced C hacker may find something of interest.

Scratch from MIT & Back to School

As time goes on, knowing how to write in a programming language is becoming less of an odd and obscurantist lifestyle choice and more of a necessity for gainful employment. Already, anyone wanting to pursue a career in the hard sciences will find themselves either developing or working with custom applications. But even entry-level and intern positions frequently have a "please help us with our website / CMS / database" component to them. The trouble is, people are terrified of code; even very smart people. It looks like Ancient Greek. For students of Ancient Greek it looks like Farsi. For Persian students of the Asiatic classics it looks like, err, English, probably. My point is that going from using the internet for Facebook to using the internet for pull requests on GitHub involves a very steep learning curve. So steep that most people fall right the hell off the curve. Enter Scratch. Scratch is an object-oriented programming language developed by the Sma…