I've read a lot of "OO considered harmful" type stuff in the past. I guess, like Lisp, it was one of those things that I shrugged off as crazy talk - because I didn't understand it, and because the writer didn't seem to explain it. Actually, I need to qualify that a bit, with some history of my own opinions on the matter of objects.
When I first started using objects extensively (around the time I got ahold of VB5 with its "new" Classes support) I thought objects were kind of handy for certain kinds of data modelling, but really didn't see what all the fuss was about writing entire programs based on objects interacting. As far as I was concerned, the Logical Thing To Do was to use objects to model, well, object-like stuff, and use good-ol' procedural style code to model logic and high-level operations involving objects.
However, I was very much a novice and unaware programmer at the time, so I figured there was a good chance I didn't know what I was talking about. While the attitude was good, I made a terrible mistake, which I now (in retrospect) kind of regret - I moved off into Java land. A lot of people were saying that Java was gonna be Really Big, and you'd better learn it if you want to have relevant skills in the real world.
I tried - really, really tried - for about a month to like Java. It had some of the niceness of VB's "get it done and screw the details" philosophy, and at least by comparison to C was kind of handy in that regard. But one thing just pissed me off about it: Java seemed to have some kind of weird, almost erotic fixation on objects. I ditched Java permanently, which (frankly) I think was a good move on my part. I've hated it ever since.
At that point, I'd been doing parallel work in VB and C for several years. I loved VB for doing "GUI stuff" and maybe the occasional rapid throwaway utility or whatever. For anything heavy-duty, I used C. (Actually, what I used was C++'s flavor of C-style programming - it was trivially isomorphic to C code, but exploited little gimmicks like omitting struct everywhere. It wasn't legal C in the sense that you could compile it straight in a C compiler, but it was for all intents and purposes C code.) It worked great - I had a potent tag-team of languages that could solve most of my problems. I could even write MS-DOS 6 compatible batch files if I needed to.
However, I felt like I had some kind of gap in my knowledge still, because I didn't quite get this whole "objects" thing. I was still a student looking for a teacher, and sadly, at that point in time, the easiest teacher to find was the OO fanatic camp. I wish I could have found a different teacher instead (like, say, Lisp). But in any case, what happened happened - and I decided to try to learn "real C++" in order to get a handle on this objects stuff.
In the course of learning C++'s flavor of OO, I realized that OO was a vacuous (or, at best, nebulous) term. Every single language had a different notion of how OO should work, and nobody agreed that anyone else's was better (except maybe Smalltalk's). Nobody seemed to agree what OO really was - it just seemed to involve a lot of objects.
Even more unfortunate for me, I chose MFC as my entry point to the C++ land of OO. It permanently warped my opinion of the entire objects notion, although eventually I think the experience will prove beneficial, all things considered - if for nothing else than the fact that it has deeply broadened my experience. Call it an eye-opener, I guess.
I dabbled in this land for a while, doing a few little projects of my own, but never really liking it. I decided that my real problem was that I didn't have a good, hard problem to solve; I was just tooling around, and didn't have room to get a real solid feel for how Objects are supposed to be.
After a couple years of dabbling, I started the Day Job From Hell. (As a matter of fact, I'd "dabbled" a bit by writing a prototype version of what eventually became the product I worked on during that job. But that's another tale.) I figured it was my lucky break: the main thing I was to work on was - you guessed it - a C++, OO-heavy, MFC-encumbered mess. Of course, at that point, it was like some kind of vision from heaven; finally, I could get some clarity on all this OO stuff!
I wrestled, hard, with that project at first. I tried - really tried - to do everything in OO style, like Java. Except at every turn, I had this deep revulsion; I felt like I was going back into Java-land. Java was supposedly this great, highly OO language, but I hated it. Something wasn't adding up in my head.
Around that time I picked up a copy of The Pragmatic Programmer. Finally, things started to click, and it was the beginning of the fastest period of acceleration in my personal understanding of programming that has happened to date (in fact, I think I'm still on that upward trend, and possibly still accelerating a bit). Instead of trying to pursue this ghost, this OO, this object-worshiping nothingness that never seemed to materialize, I started using Pragmatic principles instead.
I rewrote the program, almost 100% from scratch, against very heavy protests from the management. I firmly believe that I made the right decision. Even now, I think the management is starting to grudgingly realize that it was the right thing to do; they've had a far more stable and reliable product ever since I quit than they did when it was being actively maintained by the old developer.
After I got done rewriting the entire program, I noticed something funny: it deeply - and I mean very very deeply - resembled what I used to do in VB, all those years ago, when I first got my hands on "classes." Maybe a third of the "stuff" in the program was thought of as an object. The rest was basically procedural code, except with some of the nice trappings of C++ (STL, RAII, and such) to help smooth over the ugliness of doing applications in raw C.
For a while, I felt vaguely guilty about this. I felt like I'd betrayed my quest to Learn Objects, like I'd missed the mark somehow. I thought I had failed to understand the Grand Truth of OO, and that I was committing a deep sin by writing in the same style that I used to write VB code. I mean, hell, everyone knows VB is the worst language ever, right?
So I thought long and hard about this, in the back of my head, while other thoughts filled up my conscious effort. I realize now just how long it has been rolling around in my brain; it has only now attained clarity. The more I thought about it, the more I realized that I couldn't identify a "crime" in that code. Yeah, it wasn't Java-style OO, but it was exceptionally good code, compared to my previous stuff. Maybe it missed the mark of "the grand truth of OO," but it definitely lined up with what The Pragmatic Programmer had to say. It wasn't object-laden, but I was still proud of that architecture.
I've been doing a lot of reading to back up my efforts on this Epoch thing. As a result, I've been finding out a lot about the real situation of OO. It seems that a lot of people have arrived at this conclusion long before me; and a lot of people seem to despise OO, at least in the sense of Java's "use objects or die" approach.
Now, though, I see a new perspective. I don't think the problem here is really objects per se - I think it's object orientation. And the more I read, the more I think that this is what all the anti-OO people have really been driving at all along; I just wasn't smart enough to understand it yet.
My problem was premature rejection. I read this stuff that, to me, seemed to be saying "objects are stupid! Use Lisp instead." And this bothered me. I had all kinds of cases where objects were exceptionally good representations of certain classes of problems. Today, I'd say that objects are probably the best method (for now at least) of modelling systems that are dominated by automata. A simulation game like X3 would be a damned nightmare to write without objects.
So, I would read these anti-OO discussions, and figure everyone out there was insane. I could see the benefits and power of objects plain as day, and these people seemed to be telling me that objects were, in fact, not useful. So I largely ignored the whole thing, assuming that once I found the Great Truth of OO, I'd be able to counter their arguments.
I did find the Great Truth of OO, but the truth isn't that the Emperor is also the Messiah. The truth is that the Emperor has had a catastrophic wardrobe malfunction.
Objects are awesome tools of abstraction. I think they should stick around, and here's what I think the term should mean:
- An object has some state, also called attributes or properties.
- An object knows how to do interesting things with its own state.
That's all. No more, no less. Encapsulation, implementation hiding, data hiding, all these things are good - but they are not inherent properties of objects, nor do objects hold a monopoly on those notions. In this sense, I think, objects are still very good tools.
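To put that two-point definition in concrete terms, here's a minimal sketch in Python (the Counter class and its names are my own illustrative invention, not anything from the text above): some state, plus operations that act on that state, and nothing more.

```python
# An "object" by the minimal definition: state plus behavior on that state.
class Counter:
    def __init__(self):
        self.count = 0          # state (an attribute/property)

    def increment(self):        # the object does something interesting
        self.count += 1         # with its own state

c = Counter()
c.increment()
c.increment()
print(c.count)  # → 2
```

Note that nothing here requires encapsulation or data hiding; `count` is plainly visible. Those are separate (and separately valuable) ideas.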
Where we get into evil trouble is when we try to write all of our logic as objects. The presence of "manager" or "handler" objects is a tell-tale sign of this. Steve Yegge posted a humorous caricature of this kind of programming in Execution in the Kingdom of Nouns. I think, to a large degree, this kind of stuff is what "object-orientation" is all about: if you have code to write, find a way to cram it into an "object," a noun.
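A quick sketch of the "manager" smell, with hypothetical names of my own choosing: a stateless class that exists only to give a verb somewhere to live, next to the plain function that does the same job.

```python
# The Kingdom-of-Nouns style: logic forced into a noun. FileCopier holds
# no state of its own; the class exists purely as a container for a verb.
class FileCopier:
    def copy(self, src_lines, dest):
        dest.extend(src_lines)

# The same logic as a plain function. There is no state for an object
# to own, so no object is needed.
def copy_lines(src_lines, dest):
    dest.extend(src_lines)

dest = []
copy_lines(["line 1", "line 2"], dest)
print(dest)  # → ['line 1', 'line 2']
```

When a class has no meaningful state, the object wrapper is pure ceremony; that's the tell-tale sign described above.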
I think that we now have enough history (thanks to Java) to prove that this is a Bad Way To Do Things.
Inevitably, the question of objects is going to come up in the Epoch project. Up until now, I've been basically planning on declaring Epoch to support object-orientation. Now, though, I've changed my mind. Epoch will revile object-orientation. Epoch will spit upon it, deface it, shame it, and tell it to go back to Java land where it belongs, so that we can get work done in peace.
Epoch will have a new perspective. Well, I don't think the perspective itself is really all that new; I just don't think anyone has codified it before. I think a lot of people have become disillusioned with OO (if they were ever "illusioned" to begin with), and already have this perspective. But I've never seen a name for it, or even a really concise description of what it entails - just rhetoric about why OO is bad.
I've started thinking about it as "object awareness." Objects exist, and they're very useful tools in certain areas. As such, I think it's important to allow for them. Epoch will definitely "believe in" objects in that it won't be object-agnostic, the way, say, BASIC or C is. Objects themselves will be welcome citizens. Heck, at this point, I'm moving in the direction of modelling Epoch's language within itself, using self-referential, recursive objects.
Where the line is drawn, though, is in idolizing objects. Some things just shouldn't be objects. I think the term "object oriented" carries a sort of connotation of bending everything in the direction of objects. Objects aren't merely first-class citizens; they're aristocrats, maybe even dictators. That, I think, is the essence of all that is wrong with OO.
Here's to Object-Aware Programming. May our code be more concise, our abstractions more clean, and our modules less cluttered with FooManagers.
Python is object oriented, yet doesn't require that all your code be expressed as an explicit object attribute or operation - even though everything is an object in Python.
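For instance (a small illustrative snippet; the `greet` function and the `author` attribute are just examples):

```python
# Even a bare function is a full object: an instance of the function
# type, with attributes you can inspect and modify at runtime.
def greet(name):
    """Say hello."""
    return "Hello, " + name

print(type(greet))      # <class 'function'>
print(greet.__name__)   # greet
print(greet.__doc__)    # Say hello.

greet.author = "me"     # arbitrary attributes can be attached
print(greet.author)     # me
```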
As you can see, even a "bare" function is an object - an instance of the function type - with a slew of accessible and modifiable properties, each also an object. The availability of these attributes makes the language that much more powerful: access to them is precisely what enables metaclass hacking without the deep black magic that would be required in C++, whatever its actual uses in production code.
There's nothing wrong with object oriented languages, so long as their interpretation of object orientation does not limit programmer expressivity. Don't swing from one extreme (Java) to another (the direction in which you seem to be taking Epoch, whatever that is).