About DaniJ

  • Rank: Senior Member
  1. That was a very long time ago, back when Doomsday used a non-standard method of generating flat geometry from the original BSP data in the map. Nowadays Doomsday uses its own internal node builder for generating "GL nodes" and is no longer sensitive to partition line selection in the original BSP data.
  2. Afraid I can't offer any more help on what might have gone wrong as far as Risen 3D is concerned. After I fixed the missing ", I tried loading that "file" in Doomsday and it was able to parse it just fine. Not presently, no. Expanding map editing features is something we've not really pursued. We plan to begin looking at this stuff soon, though, once Doomsday version 2.0 has been released. Thanks. This is one aspect we take quite a bit of pride in :)
  3. I decided to try loading your mod in Doomsday to see if anything odd came up. I had to repackage it into a .zip because Doomsday does not currently support .7z. Doomsday was able to load your mod just fine (aside from a few warnings about missing flats/textures, presumably due to Risen 3D doing away with the need for low-res originals) and reports a total of 39 Map Info definitions when enumerating the total number of DED definitions in the database (the default number for Doom 2 is 33). Also, doomsday.out contained no syntax warnings. In DED syntax version 6, Doomsday made semicolons optional. So, given this fact and assuming Risen 3D only supports the older version 5 syntax, my guess is that you are missing more than one semicolon. Edit: This one https://www.dropbox.com/s/3a7fgh6uicpp5te/DD_DEFNS.txt?dl=0 is correctly formed barring a missing double-quote on line 16*. The other is malformed, in the sense that what you are actually doing is defining a single Map Info definition and then repeatedly changing its ID, Name and Author properties. The former *should* work in Risen 3D whereas the latter *might* cause problems. #1654 027.101 (!!!) Def_Read > Def_ReadProcessDED: Error: In "c:/users/danij/documents/doom/rdtextest.ded" on line #16: Syntax error in string value.
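A missing double-quote like the one diagnosed above can be caught with a naive pre-scan before handing the file to the engine. This is just an illustrative sketch, not Doomsday's actual DED parser: it flags lines whose double quotes don't pair up, and ignores escaping and multi-line strings entirely.

```python
def find_unbalanced_quote_lines(ded_text):
    """Return 1-based numbers of lines whose double quotes don't pair up."""
    bad = []
    for number, line in enumerate(ded_text.splitlines(), start=1):
        # An odd quote count on a line suggests a missing opening or
        # closing quote, the usual cause of "Syntax error in string value".
        if line.count('"') % 2 != 0:
            bad.append(number)
    return bad

sample = 'Map Info {\n  ID = "MAP01";\n  Name = "Entryway;\n}'
print(find_unbalanced_quote_lines(sample))  # [3] - line 3 lacks its closing quote
```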
  4. Mind posting a link to that concatenated DD_DEFNS lump? I could try running it through Doomsday's DED parser for you to see if it brings to light any further info.
  5. It seems you have placed your DD_DEFNS lumps alongside map data lumps like VERTEXES. As far as I'm aware, Risen 3D (like Doomsday) expects such definitions to be separate from the map data, because DD_DEFNS can be used for all manner of different purposes (not just specifying Map Info properties). Have you tried concatenating all your Map Info definitions into a single DD_DEFNS lump in the "root" of the .WAD and seeing if Risen 3D accepts that?
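The suggested concatenation is mechanical. Here is a minimal sketch assuming plain-text joining of per-map definition texts into one DD_DEFNS body; the map IDs, names and Map Info property layout are illustrative, and real WAD lump manipulation would still be done with a WAD editor.

```python
def concatenate_defns(parts):
    """Join individual DED definition texts into one DD_DEFNS lump body."""
    # A blank line between definitions keeps the result readable;
    # semicolons inside each definition are left untouched.
    return "\n\n".join(part.strip() for part in parts)

map01 = 'Map Info {\n  ID = "MAP01";\n  Name = "Entryway";\n  Author = "Someone";\n}'
map02 = 'Map Info {\n  ID = "MAP02";\n  Name = "Underhalls";\n  Author = "Someone";\n}'

dd_defns = concatenate_defns([map01, map02])
print(dd_defns.count("Map Info"))  # 2 - two separate definitions, not one redefined
```

Note that each map gets its own complete Map Info block, which avoids the malformed pattern described earlier of redefining a single definition's properties repeatedly.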
  6. FYI: Although it doesn't help you right now, we are currently working on a new model renderer for Doomsday that supports newer model formats (like MD5) and shifts much of the burden of transforming and lighting model instances onto the GPU. This method should be significantly faster (by orders of magnitude) once completed.
  7. As it would seem we simply don't agree, by all means carry on with defining your spec.
  8. That's a slippery slope argument. In terms of specifying a solution to the original problem? Ostensibly it is a slippery slope argument, without context. I'm not saying "don't do this because we don't know where it might lead", I am saying "do we even need to do it like this in the first place (and in doing so, inviting otherwise unrelated concerns)?". There seems to be a split focus in this discussion; a) signing data in a released project, and b) managing an edit time integration problem (as originally stated). I think we need to be clear and decide - which is it? I personally think we need to look into separate solutions to these problems. If you guys don't agree at this point, now that all the arguments have been laid bare - I am happy to walk away and leave you guys to it (just say the word).
  9. Believe me, I do understand the concept. I am not intentionally trying to 'derail' the topic. The problem as stated in the OP starts off fairly well-defined: essentially essel wants (a) tool(s) to assist him in the task of tracking assets and their metadata through the development process of putting together a new project. From here, things start to veer into less well-defined territory. Rather than simply being an editor/tool based mechanism to support that goal, we are now in the realms of defining a common standard for metadata that will exist in final projects released to the wild. (Clearly a solution had already been partially specified, given the title of this thread.)
Please understand that once you start introducing hashes for asserting this relationship in released projects, strictly speaking you are no longer solving the same problem. This is now veering into the realms of data signing (a topic which I freely admit to having very little knowledge of). All I am really saying is: shouldn't the problem as originally stated be solvable entirely within the editor? If so, then we don't need hashes in released projects in order to track the state of assets and their relationship to their metadata, because one should be able to assume that they do relate - the project has been finalized and released. If we can't assume this, then I agree we need a way to check whether the data has been altered. However, are we saying we need to do this for every lump individually, or can we sign the package itself and on that basis determine that the rest of the metadata/assets have been integrated by the editing tool correctly? If we can't even assume that, it seems to me we need both a common metadata standard and a mechanism to track changes - in which case, is inserting hashes into the common metadata logical (or is there a better way that separates these concerns)?
If we are going to define a common standard for metadata - that's one thing. Tracking the state of partially integrated projects during the development process is another, different thing. Checking the integrity of the subcomponents of released packages using embedded hashes is yet another. So let us just examine whether what we are purporting to do with a common standard for metadata makes sense for solving the original problem, before diving into specifying a solution for it. If we can logically separate the fundamentally different goals (as I believe we absolutely should), it simplifies the problem.
I am not a fundamentalist radical trying to derail the whole process. I am simply trying to highlight an issue which I believe has been skirted over. The reason I brought ZIP into the equation is the revised goals (tracking asset/metadata changes in released projects) leading to defining a common standard for metadata that includes hashes (which, from my perspective, concern a relationship that is already known to be good because the final project has been released in this form). A unified standard would be great for this, but understand that I want to use the same representation for metadata in ZIP as well as WAD. Edit+2: Reordered paragraphs for clarity. (2) Expanded on arguments to avoid potentially 'derailing' again.
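The distinction argued here can be made concrete. Below is a hedged sketch (SHA-256 stands in for whatever digest a real spec would choose, and the lump names are illustrative) contrasting the two schemes: embedding a hash per lump in the metadata versus computing one digest over the finished package at release time.

```python
import hashlib

def per_lump_hashes(lumps):
    """Map each lump name to a digest of its contents (hash-per-asset scheme)."""
    return {name: hashlib.sha256(data).hexdigest() for name, data in lumps.items()}

def package_hash(lumps):
    """One digest over the whole package (sign-the-release scheme).

    Names are folded in sorted order so the result is independent of
    insertion order and depends only on the package contents.
    """
    h = hashlib.sha256()
    for name in sorted(lumps):
        h.update(name.encode("utf-8"))
        h.update(lumps[name])
    return h.hexdigest()

lumps = {"DD_DEFNS": b"Map Info { ... }", "VERTEXES": b"\x00\x01"}
print(len(per_lump_hashes(lumps)))  # 2 - one digest per lump, embedded in metadata
print(package_hash(lumps))          # a single digest covering the whole release
```

The point of the contrast: the per-lump digests must live inside the metadata (and so churn during editing), while the single package digest can be computed once at release and kept outside the data it covers.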
  10. Look, I'm not on a crusade. My point is: can we keep the integration hash data out of the package being built until it is finalized by the author and released?
  11. Ultimately it was only a suggestion. I didn't previously realise that was the general view, as it's never been said. Ok
  12. No, that is not the case. I do not reject the idea on principle. I simply object to forcing the use of specialist tools to manage metadata which doesn't necessarily require hashing.
  13. In the interest of working together to resolve the situation, can we logically separate the problem presented in the OP from both the idea of a common standard for metadata and the solutions presented thus far? As we have identified, the original problem is a matter of integration. Does this necessarily require embedding hashes in "final products" that will be widely distributed?
  14. No. I presented a solution for associating and delivering a common metadata standard to/with files in a way that maximizes flexibility and which also supports WAD. What you are presenting is something which requires hashes embedded in the data, to achieve the same result.
  15. I know. We disagree on whether this is acceptable. My question was intended to highlight the inconvenience incurred when trying to perform a typical editing task with .zip, should your solution (in its present form) be chosen. If the intention is to "secure the package by including in it a data signature which can be validated", I am quite sure we can solve this without forcing every user to jump through hoops in a practical sense by mandating the use of a single, specialist editing tool. However, this is not strictly related to the idea of defining a common standard for metadata; it is a separate concern. We can discuss it here in the process, I guess. They would not be affected; metadata could still be included. I am not arguing against the idea - my objections concern how this is achieved.