Quasar

Metatables

Recommended Posts

I read most of it, but didn't find it educational. Mostly I wondered how this system would fit into the whole program, and what benefits it would bring.

Personally I would rather use C++ and std::map rather than roll my own associative containers :-D


Me, too. ZDoom also has a map class that's infinitely preferable to going to such lengths to do it in a programming language that's not suited to the task.

But remember, Eternity is still C code, and apparently there's considerable reluctance to transition to C++. I have no idea why this seems to be such a big no-no. Fixing all the syntactic problems is a finite task that's certainly far less work-intensive than having to manually roll out complex concepts in plain C that'd come naturally in C++.

Personally I'd take the plunge. The rewards are far greater than any work that'd be needed to port the code. Remember: porting to C++ does not mean changing any of the existing data structures to use C++ concepts. One can always do that later, should it become necessary.

andrewj said:

I read most of it, but didn't find it educational. Mostly I wondered how this system would fit into the whole program, and what benefits it would bring.

Personally I would rather use C++ and std::map rather than roll my own associative containers :-D

Same here. To be honest, something like this "metatable" solution to OO concepts would be my last resort, not my first choice. Doomsday is currently in the process of migrating to C++ and yes, it's a LOT of work (only because we are choosing to re-architect in the process, mind; nothing says you must), but I would much prefer that upfront time commitment to a solution such as this.

Is there any particular reason why you are opposed to switching to a more suitable language, like C++?


Wow, thanks Quasar, it was a very informative read, particularly because I myself am learning C right now (the K&R book). I didn't know Eternity was C-only; I think I might tinker around with it then and make some patches :)

(I am planning to learn C++ later, of course, but I've got to learn enough C to be comfortable in that first.)


I like C++ and the things it has to offer, but converting EE to it would be a bigger task than is immediately obvious. Here are all of the things I've noticed that would have to change:

  • Use of enumerations is restricted in C++ such that assigning enum values to int variables gives a warning.
  • There are still places that use sizeof() on enum variables or put the enum variables into structures such that their size is implicitly relied upon. C uses "int" for enums, but C++ doesn't guarantee this.
  • The cheat code (which is a horrible mess) still uses K&R function prototypes to "cheat" (ironically) function pointer type checks. I do actually intend to rewrite this code at some point though...
  • Pointer type checking is painfully more strict, requiring casts to be added to practically every expression involving more than one type.
  • C99's new standard-width integer types are still not a part of any published C++ standard, and we *just* finished moving to these to help port the program to 64-bit.
  • C++ differs in some subtle ways when it comes to the standard behavior of mathematical operations that might break the program in ways that are hard to predict or fix.
  • Some GNU C extensions used for the Linux build aren't C++ compatible and would need to be refactored.
That's a small list. Some of those are worse than others. None of them are insurmountable, of course. It's mostly a matter of having already invested the work to build a set of useful C types, which really wasn't that hard. I finished the metatable structure in two nights (though I have tweaked it a bit since then), so it wasn't exactly difficult or time-consuming.
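
To make the pointer-strictness point concrete, here is a minimal sketch (not taken from the EE source, with invented variable names) of the kind of cast a C++ compiler starts demanding on otherwise ordinary C allocation code:

    #include <stdlib.h>

    int main(void)
    {
       /* In C, the void* returned by malloc converts to int* implicitly, so
          the cast is optional.  A C++ compiler rejects the assignment unless
          the cast is spelled out, and a large C code base has thousands of
          these implicit conversions. */
       int *offsets = (int *)malloc(64 * sizeof(int));

       if(!offsets)
          return 1;

       offsets[0] = 0;
       free(offsets);
       return 0;
    }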

To me at this point it just seems best to continue to maintain the architecture of the engine as-is, and not turn it into a hybrid of C-structured code calling into C++ objects. I understand some of you will disagree. It's just what's working for us right now :)

Quasar said:

Use of enumerations is restricted in C++ such that assigning enum values to int variables gives a warning.

I never experienced that. What compiler? ZDoom uses this in large quantities throughout the code and there are no warnings.


Quasar said:

There are still places that use sizeof() on enum variables or put the enum variables into structures such that their size is implicitly relied upon. C uses "int" for enums, but C++ doesn't guarantee this.

Easily solved by adding a dummy filler value to all enum declarations (see how Windows does it).

Quasar said:

The cheat code (which is a horrible mess) still uses K&R function prototypes to "cheat" (ironically) function pointer type checks. I do actually intend to rewrite this code at some point though...

Definitely fixable.
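
For readers unfamiliar with the trick being referred to, here is a minimal sketch of how an empty K&R-style parameter list sidesteps argument checking in C; this is not the actual EE cheat code, and the names are invented:

    /* In C89, an empty parameter list is a non-prototype declaration, so the
       compiler accepts any argument list through this pointer.  In C++, "()"
       means "takes no arguments", so storing Cheat_God here and then calling
       it with an argument both become hard errors. */
    typedef void (*cheatfunc_t)();   /* hypothetical typedef for illustration */

    static void Cheat_God(int player) { (void)player; }
    static void Cheat_NoClip(void)    { }

    int main(void)
    {
       cheatfunc_t handlers[2];
       handlers[0] = Cheat_God;     /* functions with different signatures...  */
       handlers[1] = Cheat_NoClip;  /* ...stored in one table                  */

       handlers[0](0);              /* unchecked call: fine in C, error in C++ */
       handlers[1]();
       return 0;
    }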


Quasar said:

Pointer type checking is painfully more strict, requiring casts to be added to practically every expression involving more than one type.

That will definitely be the bulk of the work. Still worth it.


Quasar said:

C99's new standard-width integer types are still not a part of any published C++ standard, and we *just* finished moving to these to help port the program to 64-bit.

In GCC you have them, and in MSVC you can easily declare them yourself using the MS-specific types that do the same thing. Do you need any other compiler support? I believe with these two you already cover 99.9% of all potential platforms.
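
A minimal sketch of the MSVC side of that suggestion, assuming a hand-rolled fallback header; the organization here is an assumption, not EE's actual code:

    /* Use <stdint.h> where it exists; otherwise map the MSVC-specific
       __intN types onto the C99 names. */
    #if defined(_MSC_VER)
    typedef signed   __int8   int8_t;
    typedef unsigned __int8   uint8_t;
    typedef signed   __int16  int16_t;
    typedef unsigned __int16  uint16_t;
    typedef signed   __int32  int32_t;
    typedef unsigned __int32  uint32_t;
    typedef signed   __int64  int64_t;
    typedef unsigned __int64  uint64_t;
    #else
    #include <stdint.h>
    #endif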


Quasar said:

C++ differs in some subtle ways when it comes to the standard behavior of mathematical operations that might break the program in ways that are hard to predict or fix.

Care to explain? That's news to me.


Quasar said:

Some GNU C extensions used for the Linux build aren't C++ compatible and would need to be refactored.

That's bad, but still no show-stoppers. What did you use, for example, that would be a problem?


Quasar said:

To me at this point it just seems best to continue to maintain the architecture of the engine as-is, and not turn it into a hybrid of C-structured code calling into C++ objects. I understand some of you will disagree. It's just what's working for us right now :)

Ok, it's your choice. But from my long experience as a programmer this sounds like solving a short-term problem without really thinking ahead. This will add needless code complexity (as you have to explicitly spell out lots and lots of stuff that C++ naturally handles) and may block long-term maintainability. Trust me, it will come back to bite you in the ass later - and then it may be too late to change over and you may regret your decision.

I can tell you that every single time I worked like that, I regretted it later. Much code became unmaintainable eventually, and thanks to the poor planning in the early stages that initially made my work easier, I ended up having to go back to the start later anyway. So a lot of work had to be done twice because I didn't do it right when I could have. Some of these bad decisions from years back are still in the current GZDoom source because redoing them would be a massive undertaking.

An example: the wall rendering code has become an impenetrable mess over the years and is in desperate need of a rewrite. The data structure used by it is crap and not particularly well suited to what the engine really needs, but there's so much depending on it that every single time I tried to change it, I gave up sooner or later.

andrewj said:

I read most of it, but didn't find it educational. Mostly I wondered how this system would fit into the whole program, and what benefits it would bring.

As for this, here are some ways we're using it in EE already:

    // in E_ProcessDamageFactors...
    MetaSetDouble(info->meta, E_ModFieldName("damagefactor", mod),
                  cfg_getfloat(sec, ITEM_TNG_DMGF_FACTOR));
    
    // in P_DamageMobj:
    double df = MetaGetDouble(target->info->meta, 
                              E_ModFieldName("damagefactor", emod), 
                              1.0);
    damage = (int)(damage * df);
    
    // in E_AddMetaState:
    MetaAddObject(mi->meta, name, &newMetaState->parent, newMetaState, 
                  METATYPE(metastate_t));
    
    // in P_KillMobj:
    if(mod->num > 0 && (state = E_StateForMod(target->info, "Death", mod)))
       st = state->index;
    
So as you can see, our current application of the metatable is to extend mobjinfo with information that is generated at runtime, such as DECORATE states, damage factors, item drop types, etc. All of these things have the requirement that I store N of them and must be able to retrieve a particular one later on. They were previously implemented as separate, private lists, and this was grossly inefficient in terms of the amount of code needed. Now they all share the same common structure.

This is perhaps the largest benefit of the metatable so far:
    // in E_CopyThing (used to implement mobjinfo inheritance):
    
    MetaCopyTable(this_mi->meta, mobjinfo[pnum].meta);
    
This used to be a series of four or five separate deep list-copy routine calls, all implemented in different modules using subtly different algorithms and data structures.
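
As a further illustration of extending mobjinfo at runtime, here is a hypothetical use of the same calls shown above; the "projectilepassheight" key and the values are invented for this example and are not taken from EE:

    // Hypothetical: attach a runtime-defined numeric property to a thing type...
    MetaSetDouble(info->meta, "projectilepassheight", 32.0);

    // ...and read it back later, falling back to a default when the key was
    // never defined for this particular thing type.
    double pph = MetaGetDouble(info->meta, "projectilepassheight", 0.0);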


Sorry for the possible double post, but I cannot justify ramming these two messages together, as they are completely unrelated and both rather long.

Graf Zahl said:

I never experienced that. What compiler? ZDoom uses this in large quantities throughout the code and there are no warnings.

I believe it's contingent on whether or not the enum is defined as part of a typedef. But EE makes extensive use of such typedef names, and unfortunately, some of the code (mainly code we didn't write ourselves) is inconsistent about its application, such that some of the time the values are written into plain old integers.

Graf Zahl said:

Easily solved by adding a dummy filler value to all enum declarations (see how Windows does it).

Not sure how that works, got an example?

Graf Zahl said:

In GCC you have them, and in MSVC you can easily declare them yourself using the MS-specific types that do the same thing. Do you need any other compiler support? I believe with these two you already cover 99.9% of all potential platforms.

If that's really considered allowable, maybe. It seems like a dangerous assumption that doing this won't harm the future portability of the program.

Graf Zahl said:

Care to explain? That's news to me.

Apparently I imagined this, since I'm having trouble finding any reference to it now. Mark one off the list, maybe? ;)

Graf Zahl said:

That's bad, but still no show-stoppers. What did you use, for example, that would be a problem?

Seems it's confined to __attribute__((packed)), which is not supported in C++. It's only one thing, but it's used *all over the place*, including in the Small interpreter :(

Graf Zahl said:

Ok, it's your choice. But from my long experience as a programmer this sounds like solving a short-term problem without really thinking ahead. This will add needless code complexity (as you have to explicitly spell out lots and lots of stuff that C++ naturally handles) and may block long-term maintainability. Trust me, it will come back to bite you in the ass later - and then it may be too late to change over and you may regret your decision.

I can't really see that being the case with the metatable in particular because it is both "finished" and replaceable. The way it is designed would make it easy to gut out the hash table it depends on and replace it with something like std::map if you felt like doing that.

I think the design in this case is good, and will last us. After all, it takes some inspiration in terms of the interface from the Meta routines that are in ZDoom. I just sought a way to implement a similar data structure in pure C using the efficient generic list structure we already had as a basis.

Quasar said:

I believe it's contingent on whether or not the enum is defined as part of a typedef. But EE makes extensive use of such typedef names, and unfortunately, some of the code (mainly code we didn't write ourselves) is inconsistent about its application, such that some of the time the values are written into plain old integers.

All I can say is that ZDoom is full of such assignments and they never ever caused even a hint of a warning. Assigning ints to enums is an error, for sure, but enum to int is a perfectly valid construct to my knowledge.
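
A minimal sketch of the two directions being described; the enum and variable names are invented for illustration:

    typedef enum { EX_RED, EX_GREEN, EX_BLUE } examplecolor_t;

    int main(void)
    {
       int i = EX_GREEN;                        /* enum -> int: fine in C and C++  */

       examplecolor_t c1 = 2;                   /* int -> enum: compiles in C, but
                                                   is an error in C++ without a cast */
       examplecolor_t c2 = (examplecolor_t)2;   /* fine in both languages          */

       return (c1 == c2) ? i : 0;
    }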


Quasar said:

Not sure how that works, got an example?

Windows adds a field 'force_dword = 0x7fffffff' to most enums.
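
A minimal sketch of that idiom; the enum name and values here are invented for illustration:

    /* The Windows headers end many enums with a dummy 0x7fffffff entry so that
       the enumeration is forced to be at least 32 bits wide on every compiler,
       which makes sizeof() and in-struct layout predictable. */
    typedef enum
    {
       EX_THING_ONE,
       EX_THING_TWO,
       EX_FORCE_DWORD = 0x7fffffff  /* filler value: pins the enum's size */
    } exampleenum_t;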


Quasar said:

If that's really considered allowable, maybe. It seems like a dangerous assumption that doing this won't harm the future portability of the program.

The C99 types are just typedefs you can find in stdint.h for GCC. All GCC does is:

    typedef signed char int8_t;
    typedef unsigned char   uint8_t;
    typedef short  int16_t;
    typedef unsigned short  uint16_t;
    typedef int  int32_t;
    typedef unsigned   uint32_t;
    typedef long long  int64_t;
    typedef unsigned long long   uint64_t;
    
And this header file won't go away any time soon, so I don't see the problem. Typedefs are perfectly legal in C++, after all.


Quasar said:

Seems it's confined to __attribute__((packed)), which is not supported in C++. It's only one thing, but it's used *all over the place*, including in the Small interpreter :(

GCC supports #pragma pack. ;) I never realized this before, but one critical file in ZDoom used #pragma pack without any separate GCC handling - and it works fine. Otherwise no Linux build could work at all!
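
A minimal sketch of the portable form being suggested; the structure here is invented for illustration and is not taken from ZDoom or EE:

    #include <stdint.h>

    /* #pragma pack(push/pop) is understood by both MSVC and GCC, so a packed
       on-disk record can be declared without __attribute__((packed)). */
    #pragma pack(push, 1)
    typedef struct
    {
       char    name[8];   /* an 8-character identifier         */
       int32_t offset;    /* byte offset of the data in a file */
       int32_t size;      /* size of the data in bytes         */
    } examplerec_t;
    #pragma pack(pop)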


Quasar said:

I can't really see that being the case with the metatable in particular because it is both "finished" and replaceable. The way it is designed would make it easy to gut out the hash table it depends on and replace it with something like std::map if you felt like doing that.

I think the design in this case is good, and will last us. After all, it takes some inspiration in terms of the interface from the Meta routines that are in ZDoom. I just sought a way to implement a similar data structure in pure C using the efficient generic list structure we already had as a basis.

This was not aimed at this particular feature. I just see that you add increasingly complex stuff to your C code. This will continue to work for a while, sure. But when I see your document I just ask myself 'Is this all really necessary? Wouldn't it be better to convert the code to a language that's more naturally suited to what you do?'

Graf Zahl said:

And this header file won't go away any time soon, so I don't see the problem. Typedefs are perfectly legal in C++, after all.

I'm more worried that the C++0x standard (soon to be the C++1x standard if they don't hurry up) will define contrary or conflicting types, just for the sake of being different from C99. There's not a good history of cooperation between the C and C++ standards.

Graf Zahl said:

GCC supports #pragma pack. ;) I never realized this before, but one critical file in ZDoom used #pragma pack without any separate GCC handling - and it works fine. Otherwise no Linux build could work at all!

I found this out today myself. I don't know whether the addition of Microsoft-compatible pragmas is recent or was done a long time ago, but I don't remember them being available the last time I looked into it (this was several years ago, admittedly ;) So that's a non-issue now.

Graf Zahl said:

This was not aimed at this particular feature. I just see that you add increasingly complex stuff to your C code. This will continue to work for a while, sure. But when I see your document I just ask myself 'Is this all really necessary? Wouldn't it be better to convert the code to a language that's more naturally suited to what you do?'

I have to admit there's a bit of a deliberate notion in this design that says "look what I can make C do!" ;) Most people just assume this type of flexible structure isn't possible or practical (never mind that JavaScript's original implementation is in C, which implies this sort of thing is automatically possible). I had to prove to myself that I could do it in a few hundred lines of code, and I accomplished that.

There is a saturation point, of course, and I do worry about hitting it. However, that point also exists in C++ and the object-oriented paradigm; it just occurs in a different fashion and at a higher level of abstraction. All such paradigms have a limit to their scalability. EE is admittedly on the brink of the limit for imperative programming. The existence of the metatable in the first place proves that, I think.

Quasar said:

I'm more worried that the C++0x standard (soon to be the C++1x standard if they don't hurry up) will define contrary or conflicting types, just for the sake of being different from C99. There's not a good history of cooperation between the C and C++ standards.

That's admittedly a danger. But in the end, a global search and replace of all affected type names would be all that'd be needed eventually.

