Dennis Ritchie: the giant whose shoulders we stand on

@Karlos, Yes, I think we agree. There are few golden rules in C++, which is why I said "you should have a good reason why you're not doing: object.ack();" instead of "you should always do: object.ack();". As long as you've thought about it and have come up with a reasonable answer, then it's probably ok. Fluffy gave an equally valid response (although it's the one I hate the most ;)).

What I hate is when programmers learn a new trick and start using it everywhere. It's the good old "when you have a new hammer, everything looks like a nail" scenario. I've seen so many misapplied design patterns it isn't even funny.
 
Just learned to create and use annotations and reflection, so a note to self this is :oops:
Though I do always consciously pick my approaches.
 

Indeed, that's the pragmatic answer. However, I've encountered many puritans who would skin you alive for not making everything belong to a class, principally because they've come from an OO background where it's the only available choice; typically Java. Of course, there's nothing stopping you from writing C++ that way; the only function you have to write as a "global" non-member is main(). In my own code base, main() is actually implemented by my common linker library, and the application must extend the abstract Application class, which the library instantiates and manages via a factory method you must also implement. Usually "return new MyFunkyApplication();" is sufficient, but any other feasible implementation that returns a valid Application* would work. A typical hello world looks like this:

Code:
#include <xbase.hpp>
#include <cstdio>
class HelloWorld : public Application
{
  public:
    int32  run(); // note: virtual

  public:
    HelloWorld() {}
    ~HelloWorld() {} // note: virtual
};

// Required factory
Application* Application::createInstance()
{
  return new HelloWorld();
}

void Application::destroyInstance(Application* app)
{
  delete app;
}

// Stuff...
int32 HelloWorld::run()
{
  std::printf("hello world!\n");
  return 0;
}
Obviously that's a lot bigger than the traditional main(), but it's also a lot more capable. Application provides you with methods to retrieve validated CLI parameters and so on.
 
Ya, sticking to OO is great, but it's only great if the original model was designed well. If not, get ready for a real headache. Just throwing some functions into a class doesn't make it OO, either. OO demands you know what you're doing - and that rarely happens. But ya, overall I do my best to stick to OO practices as I certainly do see value in it. And you're right, there are all sorts of puritans out there. Always fun to debate the pros and cons of multiple inheritance with Java fanatics. ;)
 
AOP is an 'extension' of OO as far as I understand the matter. OO isn't always the answer; one programs a PLC, for instance, in quite a different way.
 
Aspect-oriented programming is indeed very useful for implementing cross-cutting concerns such as logging, caching and so forth; usually in the form of syntactic sugar that translates into auto-generated code.
 
Back on topic, I wonder how long the other giants have left? Donald Knuth must be getting on a bit by now.
 
It's funny how young the field is and yet how much it has progressed. We've gone from huge factory sized computers that did little more than blink lights, to tiny little nanometer processors that fit in your pocket and blink lights. Pretty amazing. :D
 
Quite. The progress made in controlled light blinking is staggering. Eventually, we might do away with a lot of the electronic side and have devices in which the light blinking's logic is processed entirely via... blinking lights...

Courtesy of IBM (http://en.wikipedia.org/wiki/Blinkenlights)
ACHTUNG! ALLES TURISTEN UND NONTEKNISCHEN LOOKENPEEPERS!
DAS KOMPUTERMASCHINE IST NICHT FÜR DER GEFINGERPOKEN UND MITTENGRABEN! ODERWISE IST EASY TO SCHNAPPEN DER SPRINGENWERK, BLOWENFUSEN UND POPPENCORKEN MIT SPITZENSPARKSEN.
IST NICHT FÜR GEWERKEN BEI DUMMKOPFEN. DER RUBBERNECKEN SIGHTSEEREN KEEPEN DAS COTTONPICKEN HÄNDER IN DAS POCKETS MUSS.
ZO RELAXEN UND WATSCHEN DER BLINKENLICHTEN.
 
Absolutely. The future is not only bright, it's blinking!
 