Doom Marine

Time for me to learn programming languages, suggestions?

Recommended Posts

Jodwin said:

ROFL. Okay, these days you pretty much have to know Javascript if you're doing anything on the net, but it's a fucking horrible language.


PHP is even worse. Not to mention that PHP is, in practice, very limited in what you can do with it: it's for text manipulation.


I presume you missed the somewhat sarcastic nature of my post--but I concur, PHP is pretty tacky. JavaScript is pretty core to the internet, and while it might be easier for someone new to programming to learn, it can easily instill some bad habits.

It really depends on what you want to do, though. C/C++ is a decent place to start for executable-based programming.


The harsh truth, IMO? A programming language alone won't take you very far anymore, even if you're the absolute BEST living coder for that language AND NOTHING MORE beyond it. The IT job market today is oriented towards specific APIs, frameworks and specialization in various business-oriented systems (e.g. CRMs, ERPs, etc.), so logic dictates that the "best" language to learn is the one that "unlocks" most of them.

Let's see. Java "unlocks" servlet programming, Android development, most Web frameworks (Struts etc.) and most persistence frameworks (Hibernate etc.), which seem to be pretty much the only stuff in demand at this time. "Business computing", if you wish. If you choose to go into that domain, you should probably ALSO learn at least one of the other "big" web languages: JavaScript, PHP, and maybe a bit of Perl, you never know.

C# comes in a distant second, but it effectively means working exclusively with/on Microsoft stuff. This depends on your target job market, so you'd better check on that. Some countries seem to be Java/JavaScript/LAMP-dominated (most of them), while in others there are SharePoint niches where you might squeeze in. A typical job ad requesting AT LEAST 10 different qualifications is more than enlightening.

C/C++ today seem relegated to system/game programming, so unless you truly want to spearhead into that domain, I'd say don't bother: you'll be pwned on the job market by a thousand LAMP kiddies. The only exception where they might be helpful: as a stepping stone to Objective-C, which can mean only one thing: OS X and/or iPhone development. But only if you're willing to go down that dark path.

Stuff like Fortran and MATLAB seems useful only in academia or in some research positions (e.g. weather models, neural network modeling, etc.), and those are but a slim minority of the job market, with almost zero openings.


I should've made myself clearer about my future prospects with coding. I'm not planning my way towards any IT-related career. I'm currently a School of Medicine research intern, and I'm interested in creating a program that can give me a readout of how closely a given genetic sequence resembles an existing one in a database.

Of course such programs exist already, but I'd like to explore programming as a hobby, kinda like how I first started mapping as a hobby. My short-term goal is to play around with programming languages and see where it takes me... who knows, you might even see another Doom source port many years down the road.
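For what it's worth, a toy version of that kind of similarity readout fits in a few lines of Python using the standard library's difflib. This is only a naive sketch -- real sequence-comparison tools like BLAST use proper alignment algorithms -- and the sequence names and data here are made up for illustration:

```python
# Hypothetical sketch: score how closely a query sequence resembles
# each entry in a small in-memory "database" of named sequences.
# Real bioinformatics tools (e.g. BLAST) use alignment algorithms instead.
from difflib import SequenceMatcher

def best_match(query, database):
    """Return (name, similarity ratio) of the closest database sequence."""
    scores = {name: SequenceMatcher(None, query, seq).ratio()
              for name, seq in database.items()}
    return max(scores.items(), key=lambda item: item[1])

database = {
    "geneA": "ATGCGTACGTTAG",
    "geneB": "ATGCGTTCGTTAG",
    "geneC": "TTTTAAAACCCCGG",
}

name, score = best_match("ATGCGTACGTTAC", database)
print(name, round(score, 2))
```

SequenceMatcher's ratio() is a generic string-similarity measure (2 * matches / total length), so it's only a starting point for playing around, not a biologically meaningful score.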


At least with C# you'll get a solid stepping stone/affinity for other curly-bracket languages like Java, C and C++ (you could consider C# to be Microsoft's half-breed C++/Java, pretty much).

Python currently occupies a niche as a "didactic" programming language, a niche once occupied by BASIC. That should tell you something about how far it can take you.


Yeah, but BASIC was fun. :-)

You didn't worry about libraries or APIs, frameworks, none of that bullshit. Just write some code and run. Some of them even had custom graphics/sound functions for that hardware (like the ones that shipped with Amstrad CPC, or Amiga Workbench 1.x), and you could always hack the machine code via PEEK/POKE. Hell, you were banging on the hardware at that point. That's something that has been lost, unless you're into embedded stuff, or OS drivers, etc.

Guest DILDOMASTER666
hex11 said:

Yeah, but BASIC was fun. :-)

You didn't worry about libraries or APIs, frameworks, none of that bullshit. Just write some code and run. Some of them even had custom graphics/sound functions for that hardware (like the ones that shipped with Amstrad CPC, or Amiga Workbench 1.x), and you could always hack the machine code via PEEK/POKE. Hell, you were banging on the hardware at that point. That's something that has been lost, unless you're into embedded stuff, or OS drivers, etc.


This is of course coming from the man who hasn't owned a mouse in ~15 years and browses Doomworld on a 486 through a text-only web browser

hex11 said:

Yeah, but BASIC was fun. :-)

You didn't worry about libraries or APIs, frameworks, none of that bullshit. Just write some code and run. Some of them even had custom graphics/sound functions for that hardware (like the ones that shipped with Amstrad CPC, or Amiga Workbench 1.x), and you could always hack the machine code via PEEK/POKE. Hell, you were banging on the hardware at that point. That's something that has been lost, unless you're into embedded stuff, or OS drivers, etc.

There's a very good reason why that stuff has been lost from most modern software development. :P

hex11 said:

That's something that has been lost, unless you're into embedded stuff, or OS drivers, etc.

Which is done in C with maybe some assembly.


hex11 said:
Yeah, but BASIC was fun. :-)

You didn't worry about libraries or APIs, frameworks, none of that bullshit. Just write some code and run. Some of them even had custom graphics/sound functions for that hardware (like the ones that shipped with Amstrad CPC, or Amiga Workbench 1.x), and you could always hack the machine code via PEEK/POKE. Hell, you were banging on the hardware at that point. That's something that has been lost, unless you're into embedded stuff, or OS drivers, etc.


A while back, I learned 650x assembly and BASIC just so I could tinker with my C64.

I didn't get very far before I realized how time-consuming and impractical it was to do something like that, but I still enjoyed it. :P


C# has a fairly steep learning curve for a beginner and is a poor choice for the same reasons Java would be -- it forces syntactical choices on programs that a beginner would have very little (if any) understanding of. Python (and probably Ruby too) lets you strip away all that cruft and use raw statements in the beginning; then you can begin to use functions, classes/objects, and modules/packages later as you learn them through a natural path.
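To illustrate that "natural path" idea with a made-up beginner example (not from the thread): the same task can be written first as raw statements, then wrapped in a function once that concept has been learned, with no boilerplate forced on you at either stage:

```python
# Step 1: raw statements -- no functions, classes, or imports needed.
total = 0
for n in [1, 2, 3, 4]:
    total += n
print(total)  # prints 10

# Step 2: the same logic, wrapped in a reusable function
# once the learner has met that concept.
def sum_list(numbers):
    total = 0
    for n in numbers:
        total += n
    return total

print(sum_list([1, 2, 3, 4]))  # prints 10
```

The equivalent "hello sum" in Java or C# would already require a class declaration and a static main method before the first statement runs.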

Stygian said:

A while back, I learned 650x assembly and BASIC just so I could tinker with my C64.

I didn't get very far before I realized how time-consuming and impractical it was to do something like that, but I still enjoyed it. :P


It doesn't have to be BASIC, any language that lets you hit the hardware to some degree will do the trick. For example, homebrew dev on Nintendo DS is done mostly in C, with some bits in ARM assembly.

Oh, and John Carmack understands (no surprise, since he grew up hacking on 8-bit machines):
http://web.archive.org/web/20090131155500/http://thechuckster.homelinux.com/index.php?id=7

chungy said:

C# has a fairly steep learning curve for a beginner and is a poor choice for the same reasons Java would be -- it forces syntactical choices on programs that a beginner would have very little (if any) understanding of. Python (and probably Ruby too) lets you strip away all that cruft and use raw statements in the beginning; then you can begin to use functions, classes/objects, and modules/packages later as you learn them through a natural path.

This is very much a personal thing, though: which is easier to understand, "print" or "Console.WriteLine"? I'd say the latter is much more explicit, and in that way easier for a beginner. It could be more difficult if you're one of those people who get stuck on every little detail from the very beginning, but if you aren't...

Python is a good beginner's language because you can just jump in and start writing code while the language takes care of good practices for you. But it also has some features that are bad in the long run, namely dynamic typing. I guess some new coders would whine about static typing being too restrictive, but it's that same restriction that saves them from a metric ton of eventual errors that would be difficult to track down - especially with their experience (or rather, lack thereof).
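A minimal made-up sketch of the pitfall being described -- with dynamic typing, the type error only surfaces when the bad value actually reaches the arithmetic at runtime, rather than being rejected at compile time:

```python
# Dynamic-typing pitfall: the bug hides until the bad branch runs.
def total_mass(masses):
    total = 0
    for m in masses:
        total += m  # TypeError only if a non-number actually arrives here
    return total

print(total_mass([10, 20, 30]))   # fine: prints 60

try:
    total_mass([10, "20", 30])    # a string slipped in, e.g. from user input
except TypeError as e:
    print("caught at runtime:", e)
```

In a statically typed language the equivalent mistake would typically be a compile error, which is the "restriction that saves you" point above.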


"Print" is an antiquated leftover from the era before screens, when computer programs produced their output on a teleprinter. (Another funny leftover from this is how the line break is divided into two characters, Carriage Return and Line Feed, which gave the printer enough time to move its carriage all the way back to the left. That's why DOS-format text uses CRLF, while Unix format kept only LF and classic Mac format kept only CR.)
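The three conventions, shown as raw bytes in Python (a quick illustration, not part of the original post):

```python
# The three historical line-ending conventions as raw bytes.
dos_line  = "Hello\r\n"   # CR+LF -- DOS/Windows
unix_line = "Hello\n"     # LF only -- Unix
mac_line  = "Hello\r"     # CR only -- classic Mac OS

print(dos_line.encode())   # prints b'Hello\r\n'
print(unix_line.encode())  # prints b'Hello\n'
```

Python's text mode hides this by translating all of them to "\n" on read ("universal newlines"); open a file in binary mode ("rb") to see which convention it actually uses.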

Kinda like how the "save" icon is still a 3½-inch floppy disk. When was the last time you used one, 10 years ago maybe?


Console.WriteLine() may sound more logical than just print() (that's debatable), but who wants to go through the trouble of writing that much over and over again? You're going to be using it a hell of a lot, not just to output stuff to the console, but to files, sockets, etc., so why not keep it short and sweet, to make programming more efficient? Heck, some BASIC dialects even provided a shorthand way to say print:

? "Hello world"

It really takes just 30 seconds for a n00b to learn what "print" means. Enforcing verbosity carries the built-in assumption that humans are incapable of learning the basic syntax of a language they're going to be using for many years to come, and thus should be punished during all those years by having to write lots of extra characters. Whoever invented these languages is a real sadistic bastard, for sure!


That's why you end up doing something like #define log console.writeline and then you can type log("hello world");

Dunno if there are #defines in C#, though. All I know about C# is that it's Microsoft's attempt at making a "better" Java, that is to say, a Java that only works on Microsoft platforms.
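For comparison (a hypothetical aside, since the posts above are about C and C#): in a language with first-class functions, the same shortening trick needs no preprocessor at all -- in Python you can just bind the function to a shorter name:

```python
# Alias the built-in print function to a short name -- no macro needed,
# because functions are ordinary values that can be assigned.
log = print
log("hello world")  # prints hello world
```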

Gez said:

That's why you end up doing something like #define log console.writeline and then you can type log("hello world");

Dunno if there are #defines in C#, though.

There are - but they aren't as powerful as they are in C. It isn't possible to do what you describe, for example; #defines in C# are pretty much limited to conditional compilation.

All I know about C# is that it's Microsoft's attempt at making a "better" Java, that is to say, a Java that only works on Microsoft platforms.

It's more like "making a version of Java that they control" - Microsoft used to develop Java development tools until they got sued by Sun. I can't say I really blame them for going down the C# route - they obviously realised that the Java approach is useful, but that it's dangerous for them to invest in a technology owned by one of their competitors.

chungy said:

C# has a fairly steep learning curve for a beginner and is a poor choice for the same reasons Java would be -- it forces syntactical choices on programs that a beginner would have very little (if any) understanding of. Python (and probably Ruby too) lets you strip away all that cruft and use raw statements in the beginning; then you can begin to use functions, classes/objects, and modules/packages later as you learn them through a natural path.

I don't think it's that bad to be honest. I agree that Python would be a better choice of first language but you can do a lot worse than C#.


All n00bs should be locked in a room with a text-only OpenBSD machine*, a copy of Learning Perl, and daily pizza/beer rations until they can crank out a working web server in pure Perl (no CPAN modules allowed).

* base system without the X11 stuff, and no packages


If you're programming in C# you're probably using some incarnation of Visual Studio. With that you have Intellisense.

hex11 said:

console.writeline() may sound more logical (that's debatable) than just print(), but who wants to go through the trouble of writing that much over and over again all the time?

But you don't. Modern IDEs' IntelliSense keeps track of what you write, and if "console" is the most common word starting with 'c' that you write, and "writeline" is the most common method you use on Console, you can just press c, enter, w, enter. :P

Also, Console.WriteLine() writes to the console, not to sockets or files or streams. For those there's

    MemoryStream ms = new MemoryStream();
    ms.Write(...);

and others. As for verbosity, the point is in writing self-documenting code. There's a very good reason why "int fuckyou = something" or "int a = something" is frowned upon, as opposed to "int carMass = something".


Yeah, vim has that keyword/var/whatever completion stuff now too. It also has a built-in spellchecker. It's slowly turning into Emacs. :-o

But I still don't like the extra verbosity, since it doesn't really clarify things IMO. I mean, if your default print() goes to stdout, then it's pretty obvious what's happening when you write something like this:

    open(LOG, '>', $logfile) or die;
    print LOG $logmsg;

Variable names are another story; the coder controls all that. Heck, Perl even gives you the option of using the English module, which defines alternative names like $INPUT_RECORD_SEPARATOR instead of $/, even though the shorthand versions are mnemonic and easy to remember after using them a bit.

Jodwin said:

But you don't. Modern IDEs' intellisense keeps track of what you write, and if "console" is the most common word starting with 'c' that you write, and "writeline" is the most common function you use for console, you can just press c,enter,w,enter. :P


That doesn't happen for me. The first 'C' word I get is const, so I have to type "conso". The first 'W' I get within Console is "WindowHeight". The end result is something like "conso"+TAB+".wr"+TAB+DOWN+TAB.


Follow-up:

 

2014: Finally put down the books and used Codecademy to finish the Python lesson module.

2015: Got a written warning from my biotech lab: it literally read "Programming while working"

2015: This sucks, I need a career change. 

2016: First day of class as a CS undergrad, age 30.

2016: Interviewed by Boeing: "Do you have a website?" ... me: "Give me like a weekend"

2016: https://busybeaverhp.github.io/

2017: Got hired at Boeing as an Information Security Engineer before graduation.

2017: Graduated valedictorian with 4.0.

2017: Turned down a software internship from Microsoft, which paid more than Boeing... but didn't allow me to work from home like Boeing did.

2018: Earned enough FU money to not work anymore.

2019: DVII continues.

 

Lesson:

 

1. IT has a much faster ROI than med.

2. Having done both, technical college has a much, much higher ROI than uni.

3. Anyone is capable of this; I started my journey after turning 30.

Edited by Doom Marine


Being new to programming, one of my favorite ways to leverage smarter people is bookmarking:

 

[attached screenshot]

 

It's been a year since I wrote any code... if anyone asks about programming, I dunnochit


I'm happy about your success and would like to ask some questions, if you don't mind. How much time did it take to learn your first programming language? And how much did you study every day, and how? I'm starting to learn my first programming language (JavaScript) for game development as a hobby and would like to move on to C# and C++ next, so if you have anything to say about that, I'd be glad to hear it.


Thanks, I forgot to add another highlight:

 

2017: Turned down a software internship from Microsoft, which paid more than Boeing... but didn't allow me to work from home like Boeing did.

