Long-held, incorrect programming assumptions [closed]


I am doing some research into common errors and poor assumptions made by junior (and perhaps senior) software engineers.

What was your longest-held assumption that was eventually corrected?

For example, I long assumed that the size of an integer was a standard, when in fact it depends on the language and target. A bit embarrassing to state, but there it is.

Be frank; what firm belief did you have, and roughly how long did you maintain the assumption? It can be about an algorithm, a language, a programming concept, testing, or anything else about programming, programming languages, or computer science.


That XML namespaces (or worse, well-formedness) are in some way more difficult than trying to do without them.

A very common blunder, even at the W3C!


My incorrect assumption: That while there's always some room for improvement, in my case, I am pretty much as good a programmer as I can be.

When I first got out of college, I'd already been programming C for 6 years, knew all about "structured programming", thought "OO" was just a fad, and thought "man, I am good!!"

10 years later, I was thinking "OK, back then I was nowhere near as good as I thought I was... now I get the ideas of polymorphism and how to write clean OO programs... now I'm really good".

So somehow, I was always really good, yet also always getting way better than I was earlier.

The penny dropped not long after that, and I finally have "some" humility. There's always more to learn (I have yet to write a proper program in a purely functional language like Haskell).


I think I was 10 years old when someone convinced me that there will be a computer capable of running an infinite loop in under 3 seconds.


In C++, for a long time I thought that the compiler rejects you when you give a definition for a pure virtual method.

I was astonished when I realized that I was mistaken.

Many times, when I tell someone to give a default implementation of the pure virtual destructor of their abstract class, they look back at me with BIG eyes. And I know from there that a long discussion will follow... It seems to be a common belief, somewhat widespread among C++ beginners (as I consider myself too; I am still learning!)

wikipedia link to c++'s pure virtual methods


I was convinced, for at least 6 years, that every problem had exactly 1 solution.

Utterly unaware of multiple algorithms with differing complexities, space/time tradeoffs, OOP vs. Functional vs. Imperative, levels of abstraction and undecidable problems. When that blissful naivety broke, it opened up a world of possibilities and slammed the door on simply sitting down and building things. Took me a long time to figure out how to just pick one and run with it.


As an old procedural programmer, I didn't really understand OO when I first started programming in Java for a hobby project. I wrote lots of code without really understanding the point of interfaces, and tried to maximize code re-use by forcing everything into an inheritance hierarchy - wishing Java had multiple inheritance when things wouldn't fit cleanly into one hierarchy. My code worked, but I wince at that early stuff now.

When I started reading about dynamic languages and trying to figure out a good one to learn, reading about Python's significant whitespace turned me off - I was convinced that I would hate that. But when I eventually learned Python, it became something I really like. We generally make the effort in whatever language to have consistent indent levels, but get nothing for it in return (other than the visual readability). In Python, I found that I wasn't doing any more effort than I had before with regard to indent levels, and Python handled what I'd been having to use braces or whatever for in other languages. It makes Python feel cleaner to me now.



That I'd be just designing and writing code.

No requirements gathering, documentation or supporting.



  • My co-workers were/are producing supposedly bad code because they sucked/suck. It took me a while to learn that I should first check what really happened. Most of the time, bad code was caused by lack of management, customers who didn't want to check what they really wanted and started changing their minds like there's no tomorrow, or other circumstances out of anyone's control, like an economic crisis.
  • Customers demand "for yesterday" features because they are stupid: not really. It's about communication. If someone tells them that everything can really be done in 1 week, guess what? They'll want it in 1 week.
  • "Never change code that works." This is not a good thing, IMO. You obviously don't have to change what's really working. However, if you never change a piece of code because it's supposedly working and too complex to change, you may end up finding out that the code isn't really doing what it's supposed to do. E.g., I've seen sales-commission software doing wrong calculations for two years because nobody wanted to maintain it. Nobody in sales knew about it. The formula was so complex they didn't really know how to check the numbers.


I had never met integer promotion before, and thought that z would hold 255 in this code:

unsigned char x = 1;
unsigned char y = 2;
unsigned char z = abs(x - y);

The correct value of z is 1.


I just recently found out that over a million instructions are executed in a Hello World! C++ program I wrote. I never would have expected so much for something as simple as a single cout statement.


That gotos are harmful.

Now we use continue or break instead.


OO is not necessarily better than non-OO.

I assumed that OO was always better. Then I discovered other techniques, such as functional programming, and had the realization that OO is not always better.


That C++ was the coolest language out there!


Don't use advanced implementation-specific features because you might want to switch implementations "sometime". I've done this time and again, and almost invariably the switch never happened.


I am a young fledgling developer hoping to do this professionally because it's what I love. This is a list of opinions I once held that, through my brief experience, I have learned are wrong:

The horrible mess you end up with when you don't separate the user interface from the logic at all is acceptable, and is how everyone writes software.

There's no such thing as too much complexity, or abstraction

One class, one responsibility - I never really had this concept before; it's been very formative for me.

Testing is something I don't need to do when I'm coding in my bedroom

I don't need source control because it's overkill for the projects I do

Developers do everything, we're supposed to know how to design icons and make awesome looking layouts

Dispose doesn't always need a finaliser

An exception should be thrown whenever any type of error occurs

Exceptions are for error cases, and a lot of the time it's OK to just return a value indicating failure. I've come to understand this recently; I'd been saying it while still throwing exceptions for much longer.

I can write an application that has no bugs at all.


That we as software engineers can understand what the user really wants.


That more comments are better. I've always tried to make my code as readable as possible--mainly because I'm almost certainly the guy that's going to fix the bug that I let slip by. So in years past, I used to have paragraphs after paragraphs of comments.

Eventually it dawned on me that there's a point where more comments -- no matter how neatly structured -- add no value and actually become a hassle to maintain. These days, I take the table-of-contents + footnotes approach, and everyone's happier for it.


That the only localization/internationalization issue is translating messages.

I used to think that all other languages (and I had no concept of locales) were like English in all ways except for words and grammar. To localize/internationalize a piece of software, therefore, you only needed to have a translator translate the strings that are shown to the user. Then I began realizing:

  • Some languages are written right-to-left.
  • Some scripts use contextual shaping.
  • There is large variation in the way that dates, times, numbers, etc. are formatted.
  • Program icons and graphics can be meaningless or offensive to some groups of people.
  • Some languages have more than one "plural form".
  • ...

Even today I sometimes read about internationalization issues that surprise me.


I used to think that the Internet Explorer 6 box model was an evil, dumb idea that MS came up with only to break compatibility with other browsers.

Lots of CSSing convinced me that it's actually much more logical, and can make page-design maintenance (changing a block's padding/borders/margins) much easier.

Think about the physical world: changing the padding or border width of an A4 page doesn't change the page width; it only reduces the space for the content.
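For what it's worth, the behaviour this answer praises is available in standards-mode CSS today as box-sizing: border-box; a minimal sketch (the .card class name is my own example):

```css
/* With border-box, padding and border are carved out of the declared
   width -- like margins on an A4 page -- so the box stays 300px wide
   no matter how the padding or border changes. */
.card {
  box-sizing: border-box;
  width: 300px;
  padding: 20px;
  border: 2px solid #333;
}
```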


  • Programming Language == Compiler/Interpreter
  • Programming Language == IDE
  • Programming Language == Standard Library


I used to think I was a pretty good programmer. Held that position for 2 years.

When you work in a vacuum, it's easy to fill the room :-D


That the now-popular $ sign was illegal as part of a Java/JavaScript identifier.


Thinking that I know everything about a certain language / topic in programming. Just not possible.


That virtual-machine architectures like Java and .NET were essentially worthless for anything except toy projects because of performance issues.

(Well, to be fair, maybe that WAS true at some point.)


It's important to subscribe to many RSS feeds, read many blogs and participate in open source projects.

I realized that what is really important is that I spend more time coding. I have had the habit of reading and following many blogs, and while they are a rich source of information, it's really impossible to assimilate everything. It's very important to have balanced reading, and to put more emphasis on practice.

Regarding open source, I'm afraid I won't be popular here. I have tried participating in open-source projects, most of them in .NET. I was appalled to see that many open-source projects don't even follow a proper architecture. I saw one system in .NET that didn't use a layered architecture, with database-connection code all over the place, including the code-behind, and I gave up.


That managers know what they are talking about.


That my schooling would prepare me for a job in the field.


That learning the language is just learning the syntax, and the most common parts of the standard library.


That bytecode-interpreted languages (like C# or F#) are slower than those reset-button hogs that compile directly to machine code.

Well, when I started holding that belief (in the 80s), it was true. However, even in the C# era I sometimes wondered whether "putting that inner loop into a .cpp file would make my app go faster".

Luckily, no.

Sadly, I only realized that a few years ago.


"It's going to work this time"
