Cacodemon345

What do you think of the C-family programming languages?


20 hours ago, Jon said:

 

Imagine if loads of tree surgeons used a chainsaw without safety guard and kept losing fingers. You wouldn't blame all the tree surgeons for being sloppy, you'd make sure chainsaws have proper guards and safety cut-outs.

 

 

That's a bit of far-fetched bullshit.

Please provide a less hyperbolic argument.

6 hours ago, Xcalibur said:

in addition, different languages have different strengths. as someone once put it to me, you wouldn't code a web app in C++, and you wouldn't code an OS in Java.

 

With WebAssembly you actually can write parts of web apps in C or C++, and I think such use cases will actually increase.

 

18 hours ago, Ferk said:

Yes. Any attempts have failed so far, except targeted solutions. Also, things like image recognition, translation, machine learning, etc. are basically programs coding other programs for very targeted and specific applications. Yet in several cases it resulted in algorithms much more accurate than anything a coder could possibly have written manually.

 

Hmm... dunno, but I regard tasks such as recognizing an image (or detecting a pattern/finding the optimal solution to a mathematical or engineering problem) to be more like a kind of mathematical function or DSP application, not a program in the traditional sense. Such problems, yes, those are suited to machine learning/evolutionary algorithms (but not all of them; AI-based methods are often inefficient compared to a traditional algorithm, assuming one can be found).

 

The difference being that a complete program also has to interact with the OS and the user, draw pretty graphics, print meaningful messages in a human language, etc., and there's usually pesky stuff like "usability", "interface design" and "customer support" to deal with. Sure, you could deal with those issues using bolt-on parts, as usual, but then who writes those parts? ;-)

 

Of course, if you take those pesky humans and their frivolous specs out of the equation completely, maybe only then will AI coding realize its full potential.

50 minutes ago, Graf Zahl said:

 

With WebAssembly you actually can write parts of web apps in C or C++, and I think such use cases will actually increase.

 

 

I actually coded (part of) a webapp in C# for work a few years back, through the .NET Framework, so, yeah.

 

17 minutes ago, Maes said:

Of course, if you take those pesky humans and their frivolous specs out of the equation completely, maybe only then will AI coding realize its full potential.

this: http://www.digitaljournal.com/tech-and-science/technology/a-step-closer-to-skynet-ai-invents-a-language-humans-can-t-read/article/498142

 

21 hours ago, Graf Zahl said:

I'm very pragmatic here. It's easier to simply not use those features of a language which I don't like than to try to make do with a limited language that is simply lacking things I need for comfortable development. In that regard, I'd always choose C++ because nobody is forcing me to use iostreams or some of the more inept features of the standard library, or god forbid - Boost.

 

Right, but for my engine, the goal was to be compiled as both an SDL app and a RetroArch plugin DLL, and RetroArch runs on all of these platforms. If I wanted to be portable, I only had two options - ANSI C or C++98.

 

I suppose that if you are 100% sure that your program or library will always be compiled with a C++ compiler from this decade that is relatively free of bugs, then sure, I can see how the case for C is pretty much nil at this point.

2 hours ago, AlexMax said:

 

Right, but for my engine, the goal was to be compiled as both an SDL app and a RetroArch plugin DLL, and RetroArch runs on all of these platforms. If I wanted to be portable, I only had two options - ANSI C or C++98.

 

I'd still choose C++ then. Even that old dialect is magnitudes better than naked C.

But wow, isn't RetroArch taking the "retro" part a bit far, supporting platforms that have no viable use anymore and in the process sacrificing convenience on the relevant ones by requiring the code to still work on obsolete compilers?

2 hours ago, Graf Zahl said:

I'd still choose C++ then. Even that old dialect is magnitudes better than naked C.

 

There are a couple of features from C++98 that I missed, but many more features I coveted were from newer standards. Besides, the reason I stopped work on the project had nothing to do with C - the lack of classes and std::string was a minor inconvenience, but a manageable one.

 

2 hours ago, Graf Zahl said:

But wow, isn't RetroArch taking the "retro" part a bit far, supporting platforms that have no viable use anymore and in the process sacrificing convenience on the relevant ones by requiring the code to still work on obsolete compilers?

 

Usually when a device is jailbroken, Doom is one of the first things ported, and RetroArch is not far behind.

 

That being said, it's more than just retrocomputing.  A friend of mine puts food on the table writing C for embedded devices, and from what I've been able to make out C++ is still not common in the areas he works in.

37 minutes ago, AlexMax said:

That being said, it's more than just retrocomputing.  A friend of mine puts food on the table writing C for embedded devices, and from what I've been able to make out C++ is still not common in the areas he works in.

 

Understandable, because these devices cannot work with languages that generate code behind the programmer's back. With the limited resources available, the programming needs are vastly different from those on real computers.

20 hours ago, Graf Zahl said:

 

With WebAssembly you actually can write parts of web apps in C or C++, and I think such use cases will actually increase.

 

granted, it's not perfectly clear-cut. my point is that there won't be a single, perfect language, since there are pros and cons and different things to emphasize.

7 hours ago, AlexMax said:

That being said, it's more than just retrocomputing.  A friend of mine puts food on the table writing C for embedded devices, and from what I've been able to make out C++ is still not common in the areas he works in.

Being someone who has touched EFI development, it is understandable indeed: considering the effort required to even build a C++ standard library, and the enormous size involved, C++ wouldn't be a good option for low-level/embedded development.


If for one moment we stick just to the syntactic aspect of curly-brace languages, very few (I believe) would argue that their approach isn't the best for visually/intuitively structuring code, in a way that makes both the compiler's and the programmer's job easier. Unless you consider the Pascal/Delphi style, Python (ugh...), BASIC or, worse, Fortran (the non-free-form kind) to be superior.


Except those fools who think that English words are easier to understand. Which results in languages like Lua or, if we go back far into the past, COBOL.

 

8 minutes ago, Graf Zahl said:

Except those fools who think that English words are easier to understand. Which results in languages like Lua or, if we go back far into the past, COBOL.

 

COBOL is great. (disclaimer: it ain't, I'm just lying to myself 'cause I have to finish a program but don't feel like it)


And Lua doesn't seem too hard to understand either, but I assume its problems or limitations lie elsewhere:

 

[screenshot.png: syntax-highlighted Lua code]


The use of more words in Lua never bothered me; if code is properly indented then you can see the structure regardless of whether it is words or curly brackets.

 

Lua's main limitation, especially for large projects, is that it is a dynamically typed language and does not have C-like structs where you get a compiler error if you mis-type a field name.  For smaller things and one-off scripts, it is still pretty good (and better than shell scripting by a long way).


Lua seems to use a variation of the Fortran/Pascal style of block delimiting (which are by no means equivalent, just visually similar). It works, but it gets tiresome at some point.

 

This style of indentation/block delimiting also has the disadvantage that it's harder to match/align blocks, and automatic block highlighting/folding either isn't available or works unreliably, depending on the IDE. It's one thing to count curly braces, and another to match "end" statements with their corresponding block-opening statement, especially in languages that don't have matched "end" statements, e.g. if/endif, do/enddo etc., unless you also have perfect whitespace indentation. MATLAB's language seems to be the worst offender of this kind, as it somehow manages to both adopt the worst aspects of FORTRAN syntax AND dumb it down even further.

 

Also, let's say that reading all those "end" statements at the end (pardon the pun) of a long day of coding isn't always the best :-)

 

As for the use of English... this may sound like an oldschool opinion, but back in the day you simply didn't start using a computer, in any capacity, if you weren't familiar with English (for those whose first language isn't English). The basic OS, any programming language, the vast majority of application software, hell, even video games would be in English. And if you weren't familiar with it, you soon would be, at least with a functional subset. Not unlike e.g. the universal use of English in aviation.

Edited by Maes


The "everything is a hashtable" approach in Lua is also problematic with arrays: it's hard to trust the size count, since a "nil" value inside the array will make the size calculation stop at that element.

 

It also has some very weird quirks (like being 1-based instead of 0-based), and in general the language is a bit more tedious to write than something like Python (not just because of the syntax, but because it lacks some conveniences - some standard methods don't return the value back, there are no "+=" operators, and things like that force you to be a bit more verbose in places where using them would have helped with clarity).

Edited by Ferk

Just now, Ferk said:

a "nil" value inside the array will make the size calculation stop at that element

Which I believe was done to mimic the termination of C strings with a null character.

 

Some of the table syntax in Lua is also inherited from C.

Example:

a = 
{
  [1] = 5;
  [2] = 6;
}

C-only equivalent (not allowed in C++):

int a[40] = {[2] = 6, [4] = 12};

 

1 hour ago, seed said:

And Lua doesn't seem too hard to understand either, but I assume its problems or limitations lie elsewhere:

 

[screenshot.png: syntax-highlighted Lua code]

 

As your example shows, you need a lot of colors to make that code digestible. Imagine the same with less intrusive colors and you'll have a hard time separating the boilerplate from the actual content.

 

23 minutes ago, Maes said:

As for the use of English... this may sound like an oldschool opinion, but back in the day you simply didn't start using a computer, in any capacity, if you weren't familiar with English (for those whose first language isn't English). The basic OS, any programming language, the vast majority of application software, hell, even video games would be in English. And if you weren't familiar with it, you soon would be, at least with a functional subset. Not unlike e.g. the universal use of English in aviation.

 

Guess where I learned a lot of English from. But that wasn't my point. A brace is a universal symbol; everybody will immediately match opening braces with closing braces and see the inside as an inner block, because braces naturally come in pairs. Whereas with keyword-based languages you have to mentally process the keywords as well to understand the code, and of course know which keywords pair up to delimit a block.

 

18 minutes ago, Cacodemon345 said:

Which I believe was done to mimic the termination of C strings with a null character

 

At least in C it only applies to specific functions that deal with strings, not everywhere, and you can use sizeof:

int a[40] = {[2] = 6, [4] = 12};
printf("size: %zu\n", sizeof(a)/sizeof(int)); // size: 40

I'm not sure if you can do something like that in Lua

a = {[2] = 6; [4] = 12}
print("size:", #a) -- size: 0
print("size:", table.getn(a)) -- size: 0

 


Lua is essentially just JavaScript with an uglier syntax. IMO its only selling point is that it's easier to embed.

6 hours ago, andrewj said:

if code is properly indented then you can see the structure regardless of whether it is words or curly brackets.

 

[Python flashbacks]

7 hours ago, andrewj said:

if code is properly indented then you can see the structure regardless of whether it is words or curly brackets.

I have seen people opposing this exact idea, however.

7 minutes ago, Cacodemon345 said:

I have seen people opposing this exact idea however.

Besides, real programmers use only raw machine code, without pesky things like macros, free-flow syntax, comments etc.


I don't think that Lua's use of words to end blocks is a problem, especially because it nearly universally uses only a single one: end.  Then again, as far as blocks go, BASIC's aren't terrible either, since it's nearly always END whatever.  What I find most objectionable are cutesy ones like fi and esac and done used by the Bourne shell - they're not even consistent.  I also have no love for Python's indentation blocks, since they tend to result in more crashes and unexpected behavior at runtime, as opposed to a simple "unclosed block" error at compile time.

 

If there's a problem with Lua, it's 1-based indexing, and I very much would have preferred that this misfeature weren't baked into so many aspects of the language and standard library.  It might have been a reasonable idea in isolation, but the problem becomes especially nasty when you start porting code from nearly every other language out there to Lua; it's a minefield for off-by-one bugs.  A damn shame, because nearly everything else about the language I find fit for purpose as an embedded scripting language, and whatever other gripes I have with it are mere bikeshed fodder.

 

I was looking at Wren as an interesting alternative embedded language that prioritized speed and had braced syntax, but the original maintainer became busy, and it's only recently that a new set of programmers have started committing to the language regularly.

13 minutes ago, AlexMax said:

I was looking at Wren as an interesting alternative embedded language that prioritized speed and had braced syntax,

 

Just use JavaScript, the embedded kind (i.e. QuickJS). It's a lot more standardized and well-known, and *will* tend to give better results for the time being, for things like community plugins and such.

 

Not to detract from Wren, though. It is also an awesome project, very interesting and promising.

 

4 hours ago, Maes said:

Besides, real programmers use only raw machine code, without pesky things like macros, free-flow syntax, comments etc.

 


Real programmers just get the job done, no matter the code. If the job in question is very low-level - say, an embedded device or an oldschool processor like the MOS 6502 - then assembly might be the tool for the job. But stating that "only assembly programmers are real programmers" is unnecessarily elitist, and purely counterproductive.

 

It has no place in an exponentially growing ecosystem with exponentially expendable CPU time, like any modern system. It also makes maintenance, and particularly extensibility, impossible, and it's not portable. That is the reason C is even a thing, and why, even with all its hurdles, it's still a godsend in comparison to writing pure assembly - and why C is still a thing after such a long time, even after the introduction of Rust.

 

Granted, I'm not a big fan of C either. I'm really not a fan of any programming language; 90% of everything is shit. The programming language, along with its software and ecosystem, is merely the tool, not the product. A rusty but well-crafted drill can build houses better than a shiny drill with all its screws loose that leaks deadly electric current.

 

Edited by Gustavo6046

9 hours ago, Gustavo6046 said:

Real programmers just get the job done, no matter the code. If the job in question is very low-level - say, an embedded device or an oldschool processor like the MOS 6502 - then assembly might be the tool for the job. But stating that "only assembly programmers are real programmers" is unnecessarily elitist, and purely counterproductive

 

Sigh, the more time passes, the fewer new programmers "get" the good old-fashioned humor of the Jargon File and the sheer epicness of The Story of Mel. But in a day and age where script kiddies write fake ransomware just to piggyback on the real thing's scare... we've come a long (?) way, haven't we?

33 minutes ago, Maes said:

 

Sigh, the more time passes, the fewer new programmers "get" the good old-fashioned humor of the Jargon File and the sheer epicness of The Story of Mel.

 

Just seeing the term "Real programmer" being used this cluelessly makes me cringe.

Over the years I've met my fair share of people who shared some of the traits mentioned in these essays, and they were always a menace to code stability and a nuisance to work with - often they are magnitudes worse than the newbies.

Their entire attitude exemplifies irresponsibility towards their job.

 


Well, "real programmers" in the above sense are much like a capricious but talented ace pilot or football player. Their talent is what allows them to "get the job done" (and more often than not, they do get it done, especially if it's something no one else would touch with a 30-foot cattle prod), but it also means that they won't always be good team players or work well within strict procedure or company protocol. It's not even about using esoteric programming languages or refusing to use GUIs, but more about thinking outside the box and/or having a passionate perseverance.

 

They do have their uses, but employing them in a manner that restrains their talent/unique skills is a management error. There are plenty of "run of the mill" or "suit" programmers if one just wants compliance and adherence. Real programmers are best reserved for those special tasks no one else will tackle.

On 7/22/2020 at 11:11 PM, Graf Zahl said:

With the limited resources available, the programming needs are vastly different from those on real computers.

 

1 hour ago, Graf Zahl said:

Just seeing the term "Real programmer" being used this cluelessly makes me cringe.

 

I hate both "real programmer" and "real computer".

