Open Ballot: How valuable is a Computer Science degree?

TuxRadar

Out of the staff here at Linux Format, only one of us actually has a Computer Science degree. The rest of us ended up in the job as a result of our hobbies, random hacking and volunteering in various open source communities.

This got us thinking: how worthwhile are Computer Science degrees? Many technology companies complain that graduates, even of Computer Science, arrive with little understanding of how to work in industry - knowledge of version control and the like - and often lack knowledge of basic coding paradigms.

This seems a great shame to us, especially when there are so many high-quality open source projects out there that would love students to help out, and that would be happy to mentor them in return. Is the way forward more projects like Google's Summer of Code, or Red Hat's partnership with Seneca College?

We'll be recording our next podcast on Wednesday, so we want to hear from you. Let us know what you think about computer science and real world work in the comments, and we'll read out the most interesting answers in the podcast.

You should follow us on Identi.ca or Twitter


Your comments

Something not so mainstream

When I came to choosing what degree course to apply for, I was pretty sure a Computer Science degree would probably consist of an awful lot of teaching that I didn't require, since I'd already picked up quite a lot of knowledge by that point.

In the end, I decided to go for something a little off the beaten track, and chose a degree in Artificial Intelligence and Computer Science, which was 50% CS and 50% AI. I found that the CS part introduced me to some of the actual science of computers: the more analytical and mathematical aspects. The AI part taught me some really interesting non-mainstream techniques that can be applied to tough problems. They also taught us some unusual languages, such as Prolog and POP-11.

All in all, I don't think I learnt anything crucial at university. A degree gives your CV a certain level of credibility, although I might have been able to achieve the same thing if I'd spent the time hacking open source projects instead. Having said that, the Artificial Intelligence element is something that isn't all that easy to teach yourself, and I'm glad I have some basics in AI.

There's Degree and then there's Degree

Passing and getting a degree is not the same as learning something from it. I believe the question isn't general enough since, I honestly think, my opening statement is true of any and every degree. Now to answer the question: the degree is as valuable as you make it :D

I'm doing MEng Soft Eng

AND YOU TELL ME THIS NOW?!?!

I do get the impression that a degree is something that you put on your CV, but from all the employment events that my uni has run in the past couple of weeks, it would seem that 'everyone has a degree and everyone has done a group project'. Employers are looking for the extra stuff you have done outside of your course, and whether you can prove it. A quote that comes to mind is 'if your only example of teamwork is your group project at university, then you have a problem'.
It almost brings me to tears when final year students don't know how to install Linux, and when questions get asked on our Facebook group that take half a second to Google and get an answer. (Although the flame wars that ensue can be quite amusing.)

The uni and dept really push students to do a year in industry to get a year's experience doing real things. We can all do the theoretical stuff, but it's not much use if you can't put it into practice.

University is good at giving you an overview of lots of different subject areas, but they don't force students to get lots of hands-on experience. Sure, there are assignments, but they have their limitations as they need to be marked. It is up to the students to do other stuff, be that getting involved with GSoC or doing some open source work.

Just my 2 cents =)

It is the teachers...

I have noticed that job ads no longer specify a level of education, at least not as often as before. Companies have learned that a self-taught enthusiast in our field can often beat anyone with a degree. This makes the process of hiring someone very hard, I imagine.

The best would of course be to have both - the interest and the degree. And a possibility for those of us who like to tinker to be able to validate our knowledge. Unfortunately that is hard and often not an option.

Students getting Computer Science degrees without really learning to code, or anything else useful, is a symptom of the same problem. If the ones who really know what they are talking about can't validate their knowledge without taking the long route, they won't be the ones who teach, and thus the ones who have no real interest are taught by teachers without the real practical know-how.

There are always exceptions, of course. But this is a part of the human knowledge that is constantly moving forward and reinventing itself.

To get students involved in programs like the Google Summer of Code and others would be ideal, but unfortunately few teachers know or care about such endeavours.

Modern education is

Modern education is flawed.

IMHO - it would be much better if people were allowed to learn things by themselves and only come to an official place for exams.

You can absolutely study things without college. Especially when it comes to computer science or math.

Old school...

I got my B.Sc (Hons) Computer Science in 1987. I learnt most of my coding skills back then, and given the practical skills I learnt, it's been a godsend. Nowadays, it's very handy to have a B.Sc. on my CV as I do freelance work.

But could I have got along as well without it? I'm not sure...

My plans

I am thinking of doing a course in computer science at Aberystwyth. They do a course there specifically in Open Source computing, and I went there on the 31st of October to a seminar by RMS :).

Looking around at other universities, none seem as free software friendly as Aber. So I would like to ask, beyond the open ballot: how valuable is a degree in open source computing?

ZX80 and Raspberry Pi

I don't know about a degree, but certainly an interest in coding at an early age would be good. My introduction was borrowing a Sinclair ZX80 for a weekend. Recently, Radio 4's 'Click' featured the upcoming production of the Raspberry Pi board, which is to use Fedora. They were talking of $25 (£15) per credit-card-sized board. Certainly, flooding schools with this startup technology would be no bad thing.

Assuming cost=value...

...then anything from $19,000 to $97,000 from what I see quoted online. I'm sure there are some dodgy ones you can get much cheaper, and I expect the value is similarly low.

Taught by Experts

There is a lot of shoddy, insecure software about (even outside of Windows). Much of it written by self-taught, undisciplined people.

You don't get to design other works of engineering - aircraft, bridges, tower blocks etc. - without proper training and the resulting qualifications.

The computer world would be much better off if similar standards were applied. Letting the self-taught build mission critical software is akin to letting people who have taught themselves to build bridges out of Lego, build the real thing. Tacoma Narrows would become the rule rather than the exception. The computing equivalent of Tacoma Narrows is happening on most of our computers, most of the time.

Whether or not all degree courses provide the required discipline is a separate issue.

Theory vs Practice

Your question has many aspects to it. One such aspect is the question "which is more important: studying the theory (a college degree), or practicing the subject (coding a lot)?" The answer I give to this is my definition of theory. Theory is the stuff that you study while practicing (as in practicing medicine, not practicing the violin), that eventually you look at and say "not one piece of theory I have been studying has made any difference in my practice", so you stop studying it. Two months later there is a problem in your practice that bites you in the ass (bum for you Brits) because you don't know the theory you stopped studying.

In other words practice is paramount, but you still have to make sure you are steeped in the fundamentals of the theory.

Qualification worth paper it is written on?

I have a Computer Science degree, but it is not used in my job as someone whose profession is to stick a knife into people. As a hobby, I do some programming, and despite my degree my programs are a pile of junk really; thank goodness I don't have to earn a living doing it. I reckon that while earning a degree of any sort takes one through a process that trains you to think, communicate and act, the real skill comes from application, not from learning. I am not surprised that the vast majority of skilled developers and computer journalists acquired these skills through practice.

Depends on the course opportunities

I did what at the time (the 1980s) was a unique degree aimed not at developers and programmers but at consultants; half the course was principles of computing - which I have found invaluable ever since, because the principles largely remain the same even when the software changes - and half was managing organisational change.

Ironically, I have never been paid to use any of the computing side but earned lots of money doing the organisational side as a consultant.

However, having been both a tutor and a student in higher education, I would say that, while the title of the degree may be important to the employer, the opportunities to expand your horizons during the course are far more important. I happened to get a project on presenting company accounts to the corporate board, which meant I had to learn the principles of accounting - something that has stood me in good stead not only in managing my own business but in a variety of other situations.

Whenever I was supervising students doing projects, I always tried to make sure the projects were something which would expand the students' horizons as well as being something they were interested in.

principles

I've only returned to hobby programming (well, scripting actually) since being introduced to Linux and Bash. However, even though there were many years in between, I still appreciate and use the skills I learned.
Another thing about studying for a degree is that you learn problem-solving skills.

Foundation Degree Computing

Money, money, money.
I did it the reverse way round: I never finished school, worked 15 years in video production, and now I'm at university.
I took the degree I'm on and borrowed the money to do it because I love the subject.

Does that make me an idiot?

Yes.

But I'm enjoying it. Well, not really the system analysis part but all the other stuff is great.
And I got to meet RMS the other day, and some bloke from CompTIA gave me a big fat Linux book for free because I was the only one in the lecture who knew what the hell he was talking about.

Anyway, this course has a 6-week placement (6 weeks, I know, what's the point in that?) in the middle, and I'm not comfortable leaving it to the tutors to find anything good, so if anyone has any good ideas I'm all ears. I looked at the FSFE but you have to go to Berlin, and I can't find any intern stuff with Red Hat outside the US.

It's important

First of all, I think it heavily depends on the institution. Education now is a very big business and you can get a degree as long as you pay for it.

Let's assume you studied at a good university: you probably learned a lot of "boring" stuff. However, it is exactly this boring stuff which finally makes the difference.
Boring stuff like:
Software project planning,
UML,
Code management,
Clean syntax,
Algorithms,
etc.
as well as all the basics....
Assembly, compilers, microprocessor architecture, network topologies, database architectures, etc.

This makes a huge difference if you really have to code for a living. Sure, some people might catch up by self-study, but I guess there is often a gap left which makes them struggle in many situations.

It's in some ways like learning Latin if you want to become a linguist... maybe useless by itself, but very powerful as a tool for understanding other languages.

In summary, a computer science degree does guarantee that one has learned the basics and has a certain idea of important knowledge which would otherwise often be put aside because of its boring nature.

make a test

It's easy, just make a test.
Ask the degree holder: which basics did he learn?
Which theoretical topics did he have to learn?
Which courses did he take that he found enlightening?
How often does he learn about some new stuff and think to himself, "Hey, that's very similar to what I did years back in school"?

Now ask the non-degree holders....

Do you have solid knowledge of the mentioned topics?
Did you ever think about studying them, but never found the time?
Have you ever been in a situation where you had problems understanding a topic because you noticed you were missing the basics?

Honest answers please ;)

About degrees for me...

I went to University straight after school and got a guaranteed engineering post with the company who sponsored my studies. I got treated well for my efforts. I progressed up the ranks and eventually left for greener pastures. In the process I also completed a diploma in business administration. Fast forward to today... Today, the roles are reversed and I have my own company and have to make the decisions about which promising candidates to employ for the good of the business.

If I reflect back on my own experience, a degree or any noteworthy (at least somewhat demanding) qualification would definitely sway my vote if I had to choose between two otherwise similar candidates.

Having a qualification shows that the candidate has the ability to learn and some degree of discipline. I also know from experience that, even though you might not remember all the details, when faced with a problem you're like a professional tradesman with a full bag of problem-solving tools. If you're a self-taught individual, you most likely have a much smaller tool set, like the guy who fixes everything with a shifting spanner and duct tape. You take longer, and the end result is often messy!

That said, education is only half of the equation. Some of the most intelligent people I know are also the most unproductive. It is very important to verify that a candidate can convert their knowledge into useful output at an acceptable rate. Previous projects come into consideration here, as does a performance evaluation clause in the employment contract.

I never earned a computer science degree...

...but my GCSE in IT has served me well over the years. While it may have been over 20 years ago, I still find time to make Ceefax pages and find programming in BBC BASIC to be a never-ending source of usefulness.

On a slightly less sarcastic note, even my 5-year-old ECDL is looking very dated. Heaven help those people who dedicate years to a qualification only to find themselves out of work for even a year afterwards. No experience can easily mean no job.

Not one employer has ever asked to see my degree

I graduated with a BSc in Computer Science in 1979 and not one employer (and there have been over 20) has ever asked to see it. I've never used the letters after my name (far too pretentious). I am, however, very glad that I did study the degree as it gave me a very good foundation in the concepts of IT. I worked as a programmer for 20 years and a systems administrator for 10 years.

I know a few secondary school IT teachers, and from what I can tell, the IT lessons consist of training students to use whatever version of M$ Office is installed on the school network. This is seen as being useful for the students' future. Do they forget that by the time these students are in employment, the office suite in question will have moved on? I've been in IT for over 30 years, but only in the last 5 have I been supporting M$ machines (I've been lucky!).

Experience is what is really needed. How you teach that in schools is a mystery, but it should be possible.

but can't spell!

I notice from some of the previous comments that having a degree in anything computer related doesn't appear to include spelling, or the avoidance of malapropisms.

My humble attempts at programming ZX81s and BBC Bs always forced me to check what I'd written, as typos and spelling mistakes invariably caused your new spangly code to fail to run!

End of mini rant. I'll have to go now as the nurse is waiting to give me my tablets. :)

We Find It Almost A Disadvantage

I've employed quite a few people over the years and it's generally a bit of a disadvantage, more than anything. We find that for the middle-level jobs, those people fairly fresh out of university don't have the skills and experience required. If they go for the lower-level jobs, they then either expect to be paid more just because they have a degree - yes, our pay scales do have ranges depending on experience and qualifications, but they often expected more than the top end of the job's pay range - or they were just using us as a stepping-stone to get that first bit of "experience" and wouldn't be with us very long, so it was pointless employing them. We found that nearly every time we'd give an interview to someone with A-Levels and experience over someone with a degree, so it was almost more of a hindrance.
I would say that if you want to go into a more specialist area then find a course that caters for that, but otherwise seriously think about whether you want to do a degree.
The other option is to do one as part of your employment development once you've got experience, like through the OU.

Well, every kid at YRS was

Well, every kid at YRS was self-taught... I think the problem is with teaching. Even though I'm sure a CS degree is great (at the right uni), it's only going to be really useful on your CV, because if you've got a passion for technology, you most likely know a lot of it anyway.

I think it boils down to the fact that unless you're going to be a doctor or a lawyer, you should choose the degree that interests you most, and if that's ancient symbology, go for it.

Mythos..

Computer Science suffers from the same kind of myths that pervade a lot of cultural activity.
The myths being that you have to be some kind of child genius before you're any good - and the other classic, "anyone can get a degree."

Employers are just as susceptible to these myths as the rest of us.

The value of any Degree is in learning how to learn and critically evaluate information.

Is the degree valuable? I think so. But I wouldn't use it as a yardstick of a person's knowledge. More a potential yardstick of someone's commitment to the subject.

P.S. I'd also question the cleverness of assessing value against the whims of technological/industrial need.

Not much use

I've been a developer for a few years and am completely self-taught. The majority of people I have worked with in the industry were self-taught too.

When we were hiring recently and had to do interviews, the candidates who had degrees were often not as good as those who had more real-world experience.

I have a degree

I have a degree and am unable to get work in the industry for some reason. During my time at university I set up a version control system so my tutor could access my work at any time, and to give myself redundancy should I lose my work or overwrite an important file.

I found my time at university was spent on things that were useless: small tutorials that wouldn't apply in real coding projects. This is frustrating, because I knew from experimenting with open source projects that what I was being taught wouldn't work outside of a computer lab.

I do everything I can to make myself employable in the IT industry from reading technical articles daily to releasing my own software projects and still no luck.

In my own experience a degree doesn't seem to be worth the paper it's written on. The only job I have ever been able to get is part time work in a bingo hall.

Perhaps I should have gone into banking...

matters to some and not to others …

I started in the software biz 40+ years ago (mainframes) and am now doing iPhone and Android. When I started, there was no ComSci degree; it was usually mathematics or engineering. In fact, having worked for the likes of Amdahl, Cisco, and HP, my not having a degree didn't seem to matter at all (albeit I did get those jobs via adverts or via my peer "networking" circles).

However, I was turned down by Google, as they are very hoity-toity about degrees there; their bloody loss.

So, it may or may not matter, depending on your experience and who you are interviewing with (and whether you get some HR dummy that doesn't know Kobal from Cobol or Sea from C).

Masters of Computing with Honors and Scars

I graduated from a four-year Masters of Computing degree (MComp ~= MEng) from the University of Sheffield this year.

The first three years are like a "normal" CompSci BSc in that we learn things, we sit tests, then we forget them.

However, my department also did large group projects a bit like no other.

In the first year we used the waterfall model and formal methods (UML, Z specification) to design software for a pseudo-client in exacting detail.

The twist was that at each stage in the waterfall, we swapped projects with another team, got their design work, and had to begin the next step. The designs all had quirks and oddities, and everyone made a mess of implementing them. But through failure, lessons are learned.

# Waterfall is stupid
# Communication is important
# UML is a sledgehammer
# Complete up-front formal specification is overkill for all but the strictest of projects

In the second year, we worked for actual real-life clients. This time we were using eXtreme-Programming, with minimal documentation and rapid dev cycles and regular feedback from the clients.

This again fell on its face, but what project wouldn't, working in a clunky, poorly implemented, internal PHP clone of Ruby on Rails...

Lessons:
# Clients change their minds, a lot
# "agile" methods are great to deal with this
# Sometimes it may have been better to say "no" to new features at the 11th hour

The third year didn't have a large group project, as we were all doing individual dissertations. Mine was on NFS performance in computing grids (hint: it's terrible!).

The final Fourth year is the crowning glory of the course.

A student-run agile software house, supported by a University spin-off software company.

We were given crash courses in Ruby on Rails and agile development, and grouped loosely into teams:

# Communications (management)
# Development (4 parallel teams working on individual projects)
# Quality Engineering (checking our designs, writing unit tests, testing code)

We ran the internal management structure ourselves; each team had a leader, and each team had a line manager in the Comms team.

Comms organised finding us clients (typically departments within the University with an itch to scratch), and we gathered the requirements, laid out plans and basic designs.

We developed (in RoR) in an agile way, adapting to the clients' needs as they inevitably evolved, and delivered, on time, in mostly working order.

The company was composed of those studying the four-year course, and a much larger group of Masters one-year students from various Universities nationally and around the world. On the face of it, all should have started out as equals, minor language barriers aside. However, the main lesson learned was

"A Computer Science degree, a programmer does not make".

In the end, CS degrees can be a useful learning experience, if not for learning new things, then for learning to avoid the mistakes of the past.

Fostering early interest with wider availability of Computing A-levels or a new Computing GCSE (and not boring IT) and encouraging individual learning and experimentation (Raspberry Pi :D) will do more good for the software industry and make sure more "programmers" take Computer Science degrees in the future.

/*
Feel free to skip or abbreviate the first three years, it's the fourth year masters bit that says the most
*/

Oh, and furthermore ..

I've tech interviewed many folks from all over the planet (many from the famed-tech-centers in Asia) who even with their degrees ended up not being able to develop their way out of a wet paper bag.

So, just having a degree (ComSci or other) is no indication of true potential. What one wants as a potential employer is talent, good attitude, drive, and tenacity. (Again, a degree is no indication of true potential talent; it's just an indicator that you could study and pass your tests.)

(revised) Masters of Computing with Honors and Scars

/*
Ok so re-reading my original post it was monstrous, sorry for that. Pod-friendly format with a better message below:
*/

I graduated from a four-year Masters of Computing degree (MComp ~= MEng) from the University of Sheffield this year, and since finishing I've been working at a small PHP house.

In the fourth year of my degree, we did a large 40 credit module over the whole year which involved running our own agile software house (~35 people, 4 development teams, a testing team and managers).

We developed (in Ruby on Rails), for clients that we had found ourselves. We used agile methods and adapted as the clients' needs inevitably evolved. And we delivered software to three of four clients, on-time, in a mostly working order.

The 'company' was composed of those like me studying the four-year course, and a much larger group of post-graduate Masters students from various Universities nationally and internationally.

On the face of it, all of us should have started out as equals, minor language barriers aside. However, many of the one-year Masters students had poor programming skills, and limited understanding of the basics of Computer Science.

This proved to be a major point of frustration in the teams, as some were bad enough that they would have been fired had they been actual employees.

What I learned most from the project was:

"A Computer Science degree, a programmer does not make."

However, I firmly believe that:

"A programmer makes a Computer Science degree."

Fostering early interest in programming with a wider availability of Computing A-levels/BTECs or a new Computing GCSE (and not boring IT) and encouraging individual learning and experimentation (Raspberry Pi :D) will make for more inquisitive students who do more than sit exams and receive a certificate.

It's a trophy

It's a trophy, good for display only, but not much good for anything else... Just like those ornamental swords hanging on the wall... Can't beat the humble kitchen chopper that's in daily use.

Very. Although not always required.

I find that most of the people who look down on a CS degree, or who think it isn't necessary, are the ones who don't have one. You don't always need a degree to do useful work. Web development comes to mind. Plenty of mechanics get by just fine without mechanical engineering degrees. However, there is also a place for mechanical engineers, and they need to be properly educated, not just 'self-taught.' Even everyday grease monkeys would benefit from a proper education. I think a CS degree is valuable to anybody working in the field, whether they specifically need it to accomplish their tasks or not.

I also hate the assumption that computer science = programming. Nothing could be further from the truth. I am fond of the quote from Dijkstra:

"Computer science is no more about computers than astronomy is about telescopes."

I'd also like to echo the comments of "Taught by Experts"

A person who did all the right things during their CS education is a person who can think critically and solve hard problems correctly using a computer as the solution. A quality which is not impossible to find in the self-taught, but much rarer.

Useful But Pricey

I learned a lot while getting my computer science degree, but not nearly as much as I wanted to. And it sure as hell didn't require me to become a good programmer (though I tried hard to become one anyway).

I learned about databases, programming language design, algorithms, and object-oriented programming. It was mostly overview. There was certainly a lot of food for thought. But it usually wasn't the kind of knowledge you could take to work with you and use to get shit done.

My degree is only a year old and already I feel like I have learned more in the past year (at work and in my free time) than I did during my time in college. I've met good programmers with no CS degree, and poor programmers who did have degrees. There's no question that my schooling was beneficial to me, but when I consider what I gained and weigh it against the debt I am now paying off, I feel irritated and a little ripped off.

I had a better time in some of the courses outside my major, so maybe I was just in a weak CS department. Your mileage may vary.

And the meaning of worthwhile?

What do you mean by worthwhile? If you want to be a huckster taking cash and credit for other people's work, like Bill Gates and Steve Jobs - well, they didn't need degrees and you shouldn't either.

If you want to make serious contributions to the software community like Gary Kildall and Dennis Ritchie, I suspect their degree helped them a lot.

In my time I divided developers into two kinds - the "VB" crowd, which designed a lot of forms and wrote as little code as possible, and the hardcore guys, mostly doing C or C++, who routinely do bit manipulation, choose algorithms, parse blobs, etc. The "VB" crowd are likely not to make use of a degree (unless it's a degree in a business-oriented field, e.g. an MBA) except to earn more money.

Finally, how are you expecting the degree to be worthwhile? Are you expecting the degree to:
1) earn you more money
2) help you develop better software
3) create more job opportunities?

One final thing to observe is that if I were heading up a company or division that was doing GIS software, I would expect all the senior developers, regardless of their actual degree, to have had some coursework in topics like topology, (abstract) algebra, analysis, number theory etc. Why? Because chances are that at some point the senior developers will have to read sections of books on computational geometry, and I want to know they can handle them.

Might be useful

A degree IS definitely useful - if for no other reason than to get you past the HR droids that some companies employ. Their reasoning is "no degree = no brains".

Whether a CS one is more useful than, for example, an MBA really depends on what you're going to do. Of course, we'll no doubt get the (elitist?) argument that a CS degree means that you've had training in the basics of design etc. So by implication, if you've not got a degree then you're barred from, or incapable of, studying those things for yourself? No, I don't think so either.

I've certainly seen folks who had that degree who I wouldn't trust to rewrite "Pong", and conversely I've seen "hobbyists" who turned out reams and reams of excellent code. The biggest advantage I've seen to a CS degree is that it can get someone to "learn how to learn", so maintaining those skills comes easier.

Actually, continuing development is the answer - something that I've found UK businesses are very bad at. There's too much belief that training is "a waste of money". Of course, if you're serious about your craft then perhaps joining a professional body - such as the Institution of Analysts and Programmers - is the answer.

In summary, in open source I don't see a CS degree as necessarily an advantage; it easily gets beaten by having an enthusiast's outlook.

(And no, I don't have a degree, and yes - I am an IAP member)

Why?

Do I want to work for a company that demands to see a degree for a job that is mostly practical?
If the recruiters are clueless about what computer science entails, will my manager be any better?

I know people who can rattle off some crackpot theory about computer science, but are absolutely useless at programming. You are defined by what you do, and if you can't actually program, your degree is a pointless pedigree to impress companies that don't know better.

I didn't do Computer

I didn't do Computer Science, but I did a degree in the field that had a more practical focus than a theoretical one (its title was Web Design and Internet Technology).

During my degree I learnt an awful lot that I wouldn't have picked up as a hobbyist (security, law, HCI etc), and it gave me a portfolio of experience to help me get a job.

That said, in my first year on the job I learnt more about actual web development than I learnt in 3 years at university.

If you get a job that's willing to train and mentor you, I don't think a degree in the field is vital if you've got self-taught skills. However, it's certainly valuable, and a lot of fun.

If you're looking at it from a "value for money" point of view given recent UK funding changes, you need to ignore the media's reporting and look at the facts for yourself. When I took my degree I was paying ~£3000 per year in tuition fees, and started paying back when my salary was above £18,000. If I had got my degree under the current system, I would currently be paying less money each month. The total "debt" you're in is academic (ho, ho), since no one's compelling you to pay it off completely. It's more like a graduate tax.
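
To put rough numbers on that, here is a small Python sketch (illustrative only: the 9% repayment rate and the £21,000 threshold for the newer system are assumptions, not figures given here; the £18,000 threshold is the one mentioned above):

# Rough sketch of UK-style income-contingent loan repayments.
# Assumptions for illustration: repayments are a fixed 9% of income
# above a threshold; old-system threshold ~£18,000, new-system
# threshold ~£21,000. Actual rules vary by year and loan plan.

def monthly_repayment(annual_salary, threshold, rate=0.09):
    return rate * max(0, annual_salary - threshold) / 12

salary = 25000
print(monthly_repayment(salary, 18000))  # old system: ~£52.50 a month
print(monthly_repayment(salary, 21000))  # new system: ~£30.00 a month
# A bigger headline "debt", but smaller monthly payments - which is the point.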

Computer Science is not

Computer Programming.

Why do so many people equate computer science with programming? Computer science is applied math: learning how and why a computer works, learning what it can and cannot do, learning how to write efficient algorithms that can be implemented in any language. Some computer scientists don't learn how to program well, others do. Programming is incidental to computer science, not the crux of it.

When it comes time to write or choose an algorithm, the self-taught guy might be just as likely to choose an O(n) - or, god forbid, O(c^n) - solution when an O(log n) one may be available. Why? Because he doesn't know any better. He may be really good at writing code, but is he good at designing algorithms or solving problems?
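
To make the complexity point concrete, here is a minimal Python sketch (purely illustrative; the function names are made up): both functions answer "is this value in a sorted list?", but the amount of work they do grows very differently as the list grows.

# Illustrative sketch: two ways to test membership in a sorted list.

def linear_search(sorted_items, target):
    # O(n): in the worst case, every element is examined.
    for item in sorted_items:
        if item == target:
            return True
    return False

def binary_search(sorted_items, target):
    # O(log n): each comparison halves the remaining search range.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return True
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

# For a million sorted items, linear_search may need up to 1,000,000
# comparisons; binary_search needs at most about 20.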

A computer cannot do anything the programmer cannot do, and most tasks on a computer are mathematical. Having that grounding in applied math really does make a big difference, and it opens up a huge world of problems that cannot otherwise be solved, or solved correctly.

Having those skills is a far different thing than being a wizard at Ruby on Rails.

Computer Science Degree?

Hmmm, I would love to get a Computer Science degree, valuable or not, but sadly it costs an arm and a leg to attempt it, coupled with the fact that I am probably not good enough anyhow.

The few professionals I know who are in the industry are self-taught, so maybe there is hope for me yet...

The practical approach is much more important than a theoretical one, but maths is needed for a lot of coding stuff, and I suspect many coders are not that deep, maths-wise.

Bazza...

On The Job + Self Taught + MIT Open CourseWare = Happiness.

If you're going to be any good at general programming you'll have a level of innate talent that can be channeled into self-taught learning (books, YouTube etc), which can then be supplemented with MIT OpenCourseWare if so desired.

Unless you LOVE math, I would thus recommend taking one or two semesters of college-level Computer Science and getting a degree in Education. That way you can go:

1. 100% Corporate In Web Dev (full on career path)
2. Self Starter (what I've done, start my own business)
3. Teach if all else fails or when you're ready to retire : )
4. Mixture of all of the above.

Finally, as others have pointed out, there are general programmers like me, and then there are people like Lubomir Bourdev. If you're that smart, you already know that you're not so much a programmer as a math/algorithm-loving computer scientist. Two different things entirely. If you're that type then yes, a computer science degree is mandatory.

Value of CS degree

I know a fellow who has a PhD (theoretical physics) and is employed as a web designer (self-taught), so IMHO any degree should be an asset. Trouble is, degrees mean debt for the long term.
