Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Thu Nov 03, 2005 4:43 pm   

Windy!
 
Wow, who knew that it could get so windy in Colorado?! The fall winds have finally hit this year...we are getting gusts of 60 MPH this morning, with steady wind at about 30 MPH. I awoke to find broken furniture on the back deck. And it's really hard to concentrate on programming with all of the walls and windows rattling.

I know this is nothing like what hurricane victims face, but it was still an unpleasant surprise this morning. I don't mind rain, or snow...but I really hate it when it's this windy.

Anyway, I decided it had been a while since my last blog post. Chiara and I are still doing great with the low-carb lifestyle. We are working out together at the gym twice a week now (Chiara still goes to a women's-only gym called Curves two other times a week). We both still feel great and have lots of energy. I continue to have more ideas about stuff to work on than I have time.

I'm in the middle of the zMUDXP project (no, I'm still not going to reveal the *real* name of this project yet ;). I've got the project plan all laid out and have already started the coding. I'm borrowing source code from zMUD where it makes sense, and writing new stuff from scratch. I'm mainly working on the new settings format right now. The settings are stored in an SQL database to prevent corrupted settings. The trick is combining the reliable database format with some in-memory data structures to improve speed.

According to my project plan, I've got about 50 days of work ahead of me. And that's 10-hour days. So it's not looking like I'll get something out before Christmas as I hoped. As disappointing as this might be to people, I'm the one who is most disappointed because I really needed the money from the new client as soon as possible. But I'm not willing to compromise my high standards for software and I don't want to rush anything.

My main goal is to release something that is compatible with zMUD, but better in many big ways. I don't just want to release a zMUD clone, and I certainly don't want to release something that is *worse* than zMUD. I want zMUDXP to be "better and faster" than before. The "bionic" zMUD ;)

I'm going to continue to hold everyone in suspense about the name and other details of the project. It's still possible that I might have some sort of beta ready for the holidays. But like I said, I'll release something when it's ready, and not before. What you can expect to see is an initial beta version with the main core of zMUDXP, but sharing some modules straight from zMUD itself (like the mapper). Then, over the beta period, these zMUD modules will be replaced with new zMUDXP versions. This is the best way to maintain compatibility and yet get something into the hands of customers as soon as possible.

I continue to be amazed at how much zMUD really does. It's easy to sit back and say "I'm going to write a MUD client!" Many people have tried that. And it's really easy to write a *simple* MUD client. But even I have forgotten everything that zMUD is capable of. And making a new client that is better and faster than zMUD is a true challenge.

But I've learned a *lot* over the 10 years that zMUD has been developed, and with the work on zApp last year, my available tools and components have improved as well. So things are still looking really good for zMUDXP and I know people will be really happy with it.

I just need to keep focused on the goal and not worry so much about the time or money aspects. It's really easy to get frustrated with the coding on some days when something I thought should only take an hour ends up taking all day. And I get really frustrated when I spend a whole day fixing bugs in 3rd party components that I've paid a lot of money for. But on other days when the new design comes together, it's all worthwhile. And it's very refreshing to finally do some things that I've wanted to do in zMUD for a long time, but couldn't because of various Windows or compatibility issues.

I'll eventually have a new section in the forums to talk more about zMUDXP and its progress. But I didn't want to add the new forums too early and get people's anticipation too high yet. So in the meantime I'll post progress reports in this Blog area.

Now, if only this wind would stop!
Tech
GURU


Joined: 18 Oct 2000
Posts: 2733
Location: Atlanta, USA

PostPosted: Thu Nov 03, 2005 5:32 pm   
 
Hey Zugg... I've been through a few hurricanes and the notorious wind tunnels of downtown Manhattan, so I feel your pain. I think I'd take wind over snow any day, but then again I am from the Caribbean.

Glad to hear the coding progress is going well, and can't wait to get my hands on the new version.
_________________
Asati di tempari!
Rainchild
Wizard


Joined: 10 Oct 2000
Posts: 1551
Location: Australia

PostPosted: Thu Nov 03, 2005 8:41 pm   
 
Probably just as well you don't plan to release anything 'til the holidays, because I'm way-swamped at work, and swamped or no, I know I wouldn't be able to resist the urge to trial the new zMUDxp, which would get me even further behind ;)

Tell us one thing... does the new client name still start with 'z'?... I think it'd be hard getting used to pushing Start -> <something_other_than_Z> to launch the new client ;)

I guess we should'a expected it to take more than a month or two, I mean you got hundreds of thousands of lines of code in zMUD, so you got a lotta typing to do in the new product too...

Will zApp be getting upgrades at the same time, like... you'll be writing DNS lookups, and sockets... are they going to be native to zApp, or will it be in the zMUDxp framework ... or are you not writing zMUDxp inside zApp at all?
Kiasyn
Apprentice


Joined: 05 Dec 2004
Posts: 196
Location: New Zealand

PostPosted: Thu Nov 03, 2005 9:53 pm   
 
Rainchild wrote:
Tell us one thing... does the new client name still start with 'z'?... I think it'd be hard getting used to pushing Start -> <something_other_than_Z> to launch the new client ;)


I have zMUD in the quicklaunch bar! :P
_________________
Kiasyn
Owner of Legends of Drazon
Coder on Dark Legacy
Check out Talon, an easy IMC connection.
Tech
GURU


Joined: 18 Oct 2000
Posts: 2733
Location: Atlanta, USA

PostPosted: Thu Nov 03, 2005 10:14 pm   
 
I should do that... Right now it's in my frequent Apps list.
_________________
Asati di tempari!
Vijilante
SubAdmin


Joined: 18 Nov 2001
Posts: 5182

PostPosted: Fri Nov 04, 2005 1:45 am   
 
*smirk chuckle*

It seems I am the only one left who knows where all the bodies in zScript are buried. There were many things, like the "#STW + {something}" syntax, that were buried in the help well away from the command reference. I tried to cross-reference them, or in some cases just copy the whole text over into the relevant command reference section, but sometimes I still missed a few. The only way I know this is because I occasionally reference something on the forums and no one can find it in the help. It is in there somewhere, and what I mention works, but it is often one of those weird legacy syntaxes. I wouldn't mind seeing a few of these disappear, and getting a more consistent set of option parameters between the settings commands.

The "#STW +" I mentioned is probably among those that could go. Status window should be made graphically, because they are a graphical display. Definitely feel free to kill some parts of the old zScript, in the new product, to improve the overall consitency of published scripts. Compatibilty needs to take a back seat sometimes. It took MS a number of years to learn this, but the truth is that compatibility with the old format also means that you must include that format's mistakes; some of zScript falls into the same area...

...compatibility is not a reason to repeat those mistakes again.
_________________
The only good questions are the ones we have never answered before.
Search the Forums
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Fri Nov 04, 2005 7:32 pm   
 
Actually, no, the real name for zMUDXP *doesn't* start with a 'z'. It will all make sense when I announce it :) No fair trying to guess and spoiling my surprise though...just keep your ideas to yourself.

As far as zApp goes, I'm doing some bug fixes that I might be able to release at some point. I've done a bunch of fixes that haven't yet been tested with the theming, so I can't release a new zApp without more work on that. My guess is that you'll see a new zApp once a month or so.

zMUDXP isn't a "normal" zApp application. I couldn't get the speed that I needed by converting zMUD entirely to a zApp script, and of course it is hard to re-use existing Delphi code in that way. I tried playing with making the Delphi zMUD code into a DLL and calling that from zApp, but ran into other memory sharing issues that also caused slower performance.

So what I've done is essentially merge the zApp code into zMUD itself. This will evolve over time to allow zApp plugins for zMUDXP. But in its first version, zMUDXP uses the various zApp controls (the zMemo control, for example, is used for the zMUDXP script editor), and zMUDXP should be able to use zApp themes. But there isn't yet any ZML file for zMUDXP that you can edit to change it. This hybrid approach to merging the programs and sharing code seemed to be the fastest way to get zMUDXP released while still working towards a goal of using zApp files for the eventual user interface.

As an example related to Vijilante's comments about #STW, it's possible that you'll see zMUDXP use a ZML file to create user-defined "forms" and status windows. But rather than making zMUDXP a true zApp application, zMUDXP would just use its internal copy of zApp to create a form based upon a custom ZML specification. It makes zMUDXP a larger program than zMUD, but then again it adds a *lot* of new scripting possibilities. For example, the zApp str, net, math, etc. libraries are all available via scripting, in addition to the normal %functions from zMUD.

Some of this functionality will be exposed right away, and some of it will be exposed in later versions. I haven't made any real announcements yet because I haven't yet decided exactly what the first beta version will have, and I don't want to promise anything that I can't deliver. So I'm keeping things a bit vague for a while longer. It really stresses me out when I promise a bunch of stuff that I can't deliver in a timely fashion. As I mentioned, the fact that this is all taking a lot longer than I originally planned is already stressful enough. So I need to relax a bit and reduce the functionality of the first beta version and then add stuff over the coming months like I've done with zMUD in the past.
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Fri Nov 04, 2005 7:36 pm   
 
Oh, and yes, Vijilante you are right about the compatibility issue. It's a very fine line though...I need to make zMUDXP compatible enough that people are willing to pay for the new client without making all of their existing scripts invalid. If people have lots of trouble with existing scripts and have to rewrite them, then a lot of people won't bother with the upgrade. If the word gets around that the new client is too much of a pain to use, then not enough people will use it.

So I have to make the new features cool enough to balance the pain of script conversion, and make scripts as compatible as possible, without repeating any of those "mistakes" that you mentioned. Hopefully the incompatibilities will be in obscure areas that won't affect many people. And, of course, the script converter program will hopefully be good enough to flag any problems so that people are aware of what needs additional work.
Rainchild
Wizard


Joined: 10 Oct 2000
Posts: 1551
Location: Australia

PostPosted: Tue Nov 08, 2005 8:39 pm   
 
You know how current zMUD has the 'zmud script, text, msscript' panes... why don't you take that approach and add a 'legacy' pane which runs the script through the old engine... just include it in its entirety, so you can emulate the old script as if it was running in zMUD 7.21. Then moving forward, make the zMUDxp script into something similar, but without the mistakes of the past and without the requirements of 100% compatibility. If they don't want to convert their script they can still use it in 'legacy syntax' mode, but if they do, they can take advantage of the new features.

I know you're likely to say 'but I don't want to maintain 2 sets of script engine code', but the reality is you shouldn't need to -- the 7.21 engine should never need to be touched because you don't need to add new XP features to it, it's supposed to be a 'compatibility mode' not a 'use the old syntax with all the new features' mode.
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Wed Nov 09, 2005 4:53 pm   
 
That wouldn't work. The existing zMUD parser is not modular (compared to the syntax-checking yacc parser, which is). Putting the entire old parser into zMUDXP would be like putting *all* of the old zMUD code into zMUDXP, along with all of the existing quirks and problems. It would make zMUDXP a real mess. Getting away from this is one of the main reasons for zMUDXP in the first place.

Basically, if people want 100% compatibility, then they will just stick with zMUD. If they want the new features and improvements, they will deal with the minimal compatibility issues and upgrade.

I understand what you are suggesting, and if I can make the old parser modular, then I might do that. But right now it doesn't look likely.
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Sat Nov 12, 2005 10:21 pm   
 
I'm really getting depressed about how long this is all taking. On the project plan that I updated 10 working days ago, I had an estimated 50 days of work left to do. In those 10 days, I've only accomplished 3 of the 50. That means it's taking roughly three times longer than my plan. And the stuff I'm working on right now (settings editor, database settings, etc) was supposed to be one of the easier parts.

It reminds me of doing web-page design in a way...all of the time is taken by the "tweaking". The problem is that zMUDXP is all about tweaking. It's one of the most advanced user interface designs that I've ever worked on and there are lots of details to worry about. And the sad part is that in many ways, these details are invisible to the user. The user interface is just supposed to work naturally like you would expect. That means paying attention to things like exactly which control remains focused after doing a database filter, paying attention to screen updates and performance, etc, etc.

Converting the settings from a "raw" in-memory linked list into a real database has been a much bigger change than I realized. I've had to rewrite the entire settings editor from scratch. Very little code from zMUD is getting reused. And the zMUD settings editor was already pretty complex. Simplifying the settings editor while maintaining the power is a tricky job.

I know it will all work out in the end, and I know I shouldn't worry so much about how long this takes. Maybe I just need to go eat some protein or something. But I haven't had what I'd call a "smooth" programming day in a couple of weeks. It's really hard to get into the "zone" like I did with zApp last year.

We haven't made any changes to our low-carb living, so I can't blame it on bad food. Maybe I'm just working too hard and getting burned out. Of course, it's hard *not* to work all of the time with that "50-days" (now 47 days) looming ahead.

Oh well, sorry I had to vent again. Oh yeah...it's also windy again today.
Taz
GURU


Joined: 28 Sep 2000
Posts: 1395
Location: United Kingdom

PostPosted: Tue Nov 15, 2005 4:09 pm   
 
Is it the low carb causing the wind ;)

*hugs* Zugg, don't push yourself too hard.
_________________
Taz :)
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Tue Nov 22, 2005 2:15 am   
 
Heh, no, actually low carb seems to cause less "wind" ;)

Well, today was another totally frustrating day. I'm really getting fed up with all of the bugs in these 3rd party components. I had thought that the Developer Express controls would be much better than they have turned out to be. Everything works fine for "simple" applications, but as soon as I start to do something complicated, I run into all sorts of problems.

My sample of 3rd party bugs that I had to deal with today:

1) Using a MemTable component for an in-memory database (dataset), there is a CopyRecord procedure. But if I try to actually *use* this procedure to copy a record, it screws up the database because it copies *ALL* fields, including the unique ID key field, and then Posts the data, resulting in a duplicate record. There is no way to tell it to skip this field without making it an AutoInc field, and no way to tell the routine not to Post the data. I ended up having to write my own CopyRecord routine that skips the primary index field (a rough sketch of it follows after this list). Basically, the 3rd party CopyRecord routine is useless and obviously wasn't tested in any "real" applications.

2) Using the DBTreeList component from DevExpress, there is an option called SmartRefresh that improves performance by only updating nodes in the tree that have changed. Well, this option is fine except that it doesn't handle record Deletion. And there seems to be no good way to update the tree when a record is deleted from the underlying database without turning this option off. Again, not tested for "real" applications...seems that they didn't plan for anyone wanting to delete records from the database...stupid.

3) The DBTreeList has a RootValue property to determine which nodes are displayed at the root of the tree. But this property doesn't seem to work at all. I wanted to display a "subtree" and tried setting RootValue to the ID of the top node of the subtree I wanted, and nothing happens. Yet another "feature" that wasn't tested.

4) In general, the DBTreeList doesn't update itself properly, even with SmartRefresh turned off. I have to add Tree.FullRefresh calls all over the place to get the tree to update. For example, when I load a new settings file into the editor, the tree doesn't update until I call FullRefresh. You'd think that it could tell that the underlying database has changed. Very sloppy.

5) Back to the in-memory data tables, there is a nifty way using the kbmMemTable component to set up multiple "cursors" that all point to the same underlying data. Each cursor dataset points to the same physical data, but can have different cursor positions, filters, and indexes. They have a nice feature called "FilteredIndex" that is a combination of a filter and a fast index. For example, I have the full settings in a MemSettings dataset, then have a ClassSettings which points to the MemSettings physical records but has a filter like "SettingKind = ClassFolder" so that it only contains the class folder hierarchy. Well, this works fine until you update the MemSettings physical dataset. With the AutoRefresh flag on, it works fine. But this makes loading a new settings file really, really slow. So I turn off the AutoRefresh for the "attached" ClassSettings dataset, load the new settings, then turn AutoRefresh back on. Then I call the dataset.Refresh method to update the ClassSettings. It updates the data, but doesn't apply the filter. So now ClassSettings has *all* of the records again, not just the class folders. The only workaround seems to be deleting the entire index and recreating it from scratch, which is also slow, but not as slow as the AutoRefresh option.
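
As for (1), the hand-rolled replacement is nothing special. Roughly it looks like this (just a sketch with illustrative names, using plain TDataSet calls from the DB and SysUtils units, not the actual zMUDXP code):
Code:
// Copy the current record into a new one, skipping the primary key field.
// Unlike the stock CopyRecord, this doesn't Post, so the caller can still Cancel.
procedure CopyRecordSkipKey(DataSet: TDataSet; const KeyField: string);
var
  i: Integer;
  Values: array of Variant;
begin
  SetLength(Values, DataSet.FieldCount);
  for i := 0 to DataSet.FieldCount - 1 do
    Values[i] := DataSet.Fields[i].Value;        // snapshot the source record
  DataSet.Append;                                // start the new record
  for i := 0 to DataSet.FieldCount - 1 do
    if not SameText(DataSet.Fields[i].FieldName, KeyField) then
      DataSet.Fields[i].Value := Values[i];      // copy everything except the key
end;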

I've sent out bug reports to the various vendors on all of these issues, but I shouldn't have to do this. All of these controls have been around for years and should have been better debugged by now. And it takes hours and hours to look through their source code and try to figure it out myself. Unless a bug is pretty clear, I don't like messing up 3rd party source code...makes it a pain to upgrade. Although, with that said, my "changes" file for the DevExpress components now has about 30 modifications that I have to make each time I update their controls to fix various issues.

I really hate working on other people's code. Sometimes I really wonder if these 3rd party components are worth the hassle. In some ways they make zMUDXP a lot better, but they are driving me crazy with all of their little quirks.

I keep thinking that I'll get past these issues, but it seems like every day I run into new problems like this. I'm spending more than half my programming time dealing with stuff like this, and it's VERY frustrating. It's making this whole project a LOT less fun than I was hoping it would be. I was looking forward to getting away from all of the kludged code in zMUD, but now it seems that I'm having to kludge just as much stuff in zMUDXP.

No wonder most software doesn't work.
Rainchild
Wizard


Joined: 10 Oct 2000
Posts: 1551
Location: Australia

PostPosted: Tue Nov 22, 2005 8:43 pm   
 
I have never had much joy with database components of any sort... even the ones that come with delphi, .net, vb... I just don't think anyone quite knows what the controls need to do, especially when dealing with multiple computers accessing the same data. I guess you don't have to worry about that for zMUDxp, but I get it all the time... if person "A" and "B" open the same record, "A" changes one thing, hits save, "B" changes a different thing, hits save... "B"'s overwrites "A"'s changes... in the end I just write my own.
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Tue Nov 22, 2005 11:14 pm   
 
Actually, that still happens to some extent, even within zMUDxp. Not so much data "changes", but the current database record is always scrolling to something else.

For example, imagine a database-driven settings editor (like the one in zMUDxp). If you think of the old settings editor, you click on a setting (like a trigger), and it shows the details in the edit panel. In the database case, you have selected the current record by clicking on the trigger, and the editor shows the details of the current database record in the edit panel.

OK, simple enough...now add a TreeView to show the class structure and show the settings within their classes. Now you can click on a node in the tree to set the current database record, which then shows the details in the edit panel. No problem, except that *every* time a change is made to the database, the TreeView needs to refresh itself, which involves looping through the database. So adding a Tree component totally messes stuff up because it's constantly changing the current database record internally for its own housekeeping.

You end up having to play lots of games with enabling and disabling controls, and setting "OnUpdate" flags to determine whether or not you should display the edit panel. Obviously, if you don't do anything, then the edit panel will be constantly updating and flickering, making everything slow.

So, even though there are not multiple changes being made, there are still complications when multiple components are using the same database. This is why I was so excited to find the feature in the MemTable where you could have multiple cursors within the same physical dataset. I can assign a different cursor to the TreeView and let it do whatever it wants without disturbing the main cursor used to show the current record in the editor.

I got a note back from DevExpress about the issues I raised. The issue with RootValue is that it's ignored in the particular "mode" that I'm using the tree in. So I'll have to give up my ideas about using it. The issue with it not synchronizing Deletions when SmartRefresh is enabled is a "feature", but they showed me some code examples on how to fix this. Basically, instead of deleting a database record directly, I need to tell the TreeView to delete the record. Turns out I needed to do this anyway to implement stuff like confirmation warnings for deleting everything in a class. The nice thing is that the multi-selection deletion and drag/drop works like a charm and was easier than I expected.
Vijilante
SubAdmin


Joined: 18 Nov 2001
Posts: 5182

PostPosted: Sat Nov 26, 2005 2:44 am   
 
Sounds like they need a documentation guy. I can never understand why this particular category of guy is always missing from every project; guess that is why I never really made it as a programmer.
_________________
The only good questions are the ones we have never answered before.
Search the Forums
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Tue Nov 29, 2005 3:30 am   
 
Another day in the life of Zugg the programmer:

Today I got to deal with a bug in DevExpress, a *really* nasty bug in Delphi 7, and a Windows issue...

1) DevExpress has docking panels, which is one of the big reasons I bought it. I'm using it to replace the old, buggy docking code in zMUD that I kludged in Delphi 5 (which no longer works in Delphi 7). Well, when a panel is floating, there is an X button for closing the window. Even though I have the FreeOnClose option for the Docking Manager set to false, when I click the X button and then try "DockPanel.Visible := true" later to redisplay the panel, it never gets redisplayed. It's still in memory, but it can't be displayed. Emailed DevExpress on this problem.

1a) The DevExpress Toolbar system has all sorts of fancy options...but no way to make a toolbar transparent. WinXP toolbars are fine when they are at the top of the window. But when I put a toolbar in the middle of a form, and make it vertical, to hold some toolbar buttons, the WinXP gradient looks horrible against the background. Fortunately I have the DevExpress source code and was able to add a "Transparent" option for the toolbar dock. Another 2 hours gone, but at least it worked.

2) This was the nastiest problem of the day. Delphi has something called "Frames". Frames are essentially a group of other components that you can reuse (like a Form). So multiple forms can share a particular Frame, which allows you to do a lot of nice modular things with your code. So far so good. Now, I want all of my frames to inherit from a parent class. OK, it's a bit of a pain in the Delphi IDE to do that (you can't just select New>Frame, you have to select New>Other, then select your Application tab, then select the parent frame you want to inherit from). And if you get Delphi confused, it can forget that the new frame is inherited...it changes the "inherited" keyword in the form DFM file back to the normal "object" keyword and then will have all sorts of problems until you edit it back using the text version of the DFM file.

Have your eyes glazed over yet? I haven't even gotten to the bug! OK, inherited Frames seem to work. Now, we have our "Parent" frame, and our "Child" frame that inherits from it. But now, what if we want to use some other frame in our Child. Well, if the Child Frame contains a reference to another Frame, and you have the ParentBackground option set to true (for WinXP theme support), then all TGraphicControl descendants (like a simple TLabel) on both the Parent and Child frame will no longer be drawn! It took 2 hours just to pin down these conditions. Fortunately, I found an obscure newsgroup posting about this from 2003 ( http://groups.google.com/group/borland.public.delphi.non-technical/browse_thread/thread/1fdce2d3114080d7/462eabe46869a79e?q=delphi+frame+parentbackground for those who are interested). Turns out this is a reported bug in Delphi 7 (QC#3850) from 2003 that Borland decided never to bother fixing in the 7.1 update that was released last year. I'm getting REALLY TIRED of companies like Borland who decide to jump on the .NET bandwagon just so they can avoid fixing all of the bugs in their existing Win32 code (yeah, so we get NEW bugs in their .NET code).

Anyway, thank goodness for that newsgroup post. The workaround was *really* obscure, and I would have never thought of it by myself. Well, about 4 hours down the drain on this one.

3) The last one was a Windows issue. In Windows, there is a message WM_SETREDRAW that can be used to stop Windows from updating/painting a control. This is really useful to prevent flickering when you make lots of changes to a form. I use this in the Settings editor when you move from one setting editor type to another (like moving from an alias to a trigger where the edit panel needs to be changed a lot). Works great except that when you turn the redraw flag back on, the control needs to be repainted.

Well, I tried calling Invalidate to tell the control to repaint itself...no good. I called the Delphi Refresh or Update routines...no good. I called the Delphi RePaint method...still no good. The control just wouldn't repaint itself. And yet, if I clicked on another window that obscured my program, and then clicked on my program again, the window would be fine. So repainting the window does work somehow...but not when I try calling it in the code. Finally I found yet another newsgroup post about the WM_SETREDRAW message from someone having the same problem as me. The solution was to call the low-level Windows RedrawWindow routine with the "RDW_ERASE or RDW_INVALIDATE or RDW_ERASENOW or RDW_UPDATENOW or RDW_ALLCHILDREN" flags. Yep, really obvious, right?
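
For anyone who hits the same wall, the working pattern ends up looking roughly like this (just a sketch; EditPanel is an illustrative name, and WM_SETREDRAW and RedrawWindow come from the standard Messages and Windows units):
Code:
// Suspend painting while rebuilding the panel, then force a full repaint,
// since Invalidate/Refresh/Repaint don't recover from WM_SETREDRAW.
SendMessage(EditPanel.Handle, WM_SETREDRAW, 0, 0);    // 0 = stop redrawing
try
  // ... swap the edit panel controls around here ...
finally
  SendMessage(EditPanel.Handle, WM_SETREDRAW, 1, 0);  // 1 = allow redrawing again
  RedrawWindow(EditPanel.Handle, nil, 0,
    RDW_ERASE or RDW_INVALIDATE or RDW_ERASENOW or RDW_UPDATENOW or RDW_ALLCHILDREN);
end;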

Sigh...that was my programming day today. Once again, dealing with issues with other people's code. I'd really like to spend the time on MY CODE for a change. And the lack of support for bugs in Delphi is really starting to drive me crazy. Unfortunately I don't have a choice...ALL of my code is in Delphi and almost all of my programming experience is in Delphi/ObjectPascal. And I'm sure that Microsoft tools like Visual Studio have plenty of issues of their own; Microsoft isn't doing any more to support the non-.NET versions of Studio than Borland is with Delphi. I'm sure I'm not alone in this frustration.

At least it's not windy today!
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Tue Nov 29, 2005 11:28 pm   
 
Well, (1) turned out to be my problem...well, the problem of the ThemeEngine crap. In order to make the DevExpress Docking system compatible with the ThemeEngine stuff used in zApp, I had to change their DockPanel so that when it creates a floating window for the panel, the floating window has the proper caption and frame for the current theme. This means putting a ThemeEngine teForm control on the floating window.

Well, the ThemeEngine crap (I think I'll make this its official name) intercepts the WM_CLOSE event when clicking the X button on the window to close it, and then doesn't properly notify its owner window that it is being closed. So the floating window was being closed without the Docking engine being notified. So the docking engine thought the panel was still visible, and when I tried to display it using the "Visible := true" code, it was ignoring me because the Visible property was already true.

Once I fixed the ThemeEngine crap so that the teForm control properly notifies its owner that it is being closed, then it started working again.

But this uncovered another problem that really bugged me. And it has caused me to wonder...AM I THE ONLY ONE WHO CARES ABOUT QUALITY?????

OK, here's the new problem. A docking window uses something in Windows called a TOOLWINDOW style. This causes the caption bar along the top to be smaller than a normal window. A normal window has a caption that is 34 pixels high, whereas a Tool Window is only 26 pixels (at least in my XP theme). You will see this if you undock any toolbar in any program...notice that the caption on the floating window is smaller?

OK, fine, so what's the problem? Try resizing one of these Tool Windows to make it as small as possible. Imagine that you only want to display the caption and none of the window? Notice that it's impossible?

That's because MICROSOFT decided that the minimum size for ANY window was the height of the normal window caption. So when you resize a Tool Window, you still see a sliver of the client window and can't make the window any smaller. Idiots!

Why do I care about this? Well, remember the RollUp button in the zMUD caption bar (next to the StayOnTop and Min/Max buttons)? This button causes the window to shrink so that only the caption bar is visible. The ThemeEngine crap has some good support for Rollup and StayOnTop buttons in the theme. In zMUD, these buttons were always drawn in the default XP style, ignored your own theme, and interfered with stuff like WindowBlinds. In zMUDXP, using ThemeEngine, these buttons respect the current theme and you can customize the look and feel of the buttons.

But when you use the Rollup function on a ToolWindow, it can only roll up to the minimum height that MICROSOFT enforces, so you still see part of the window instead of just the small caption.

Yes, I CARE ABOUT THESE DETAILS! Am I the only one? I want my Rollup function to WORK!

I found several newsgroups posts about this problem, and people have it even in Visual Studio, so I don't think it's a Delphi problem. Unfortunately, none of the posts got any useful replies on how to fix it. Apparently nobody cares about picky stuff like this. I tried intercepting the WM_GETMINMAXINFO event to tell Windows that it was ok to have a really small window size, but it gets overridden somewhere else in Windows. I tried intercepting the WM_WINDOWPOSCHANGING event and here is where I can see it setting the minimum size, changing it to 34 in my case.

So, after many hours of messing with this (several hours just to determine whether the problem was with Delphi, ThemeEngine, DevExpress, or Microsoft) I finally totally kludged this routine to make it work. If it detects a window size of 34 (well, not hardcoded like that, but equal to the current caption height) and it's in a ToolWindow, then it forces the height to be the correct height of the toolbar caption size. So now when you resize a ToolWindow, when you get down to the end, it sort of "snaps" into place and makes the window the correct height so that you just see the caption bar. This also makes the Rollup function work.
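
For what it's worth, the kludge boils down to a WM_WINDOWPOSCHANGING handler along these lines (only a sketch of the idea; the form class is made up, and ToolCaptionHeight stands in for however the themed tool-window caption height actually gets calculated):
Code:
// Declared in the floating form class as:
//   procedure WMWindowPosChanging(var Msg: TWMWindowPosChanging);
//     message WM_WINDOWPOSCHANGING;
procedure TFloatingPanelForm.WMWindowPosChanging(var Msg: TWMWindowPosChanging);
begin
  inherited;
  if (Msg.WindowPos^.flags and SWP_NOSIZE) = 0 then
    // Windows won't let cy drop below the normal caption height (34 here),
    // so when it bottoms out, snap it down to the tool-window caption height.
    if Msg.WindowPos^.cy <= GetSystemMetrics(SM_CYCAPTION) then
      Msg.WindowPos^.cy := ToolCaptionHeight;
end;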

If anyone knows of a better way around this, or really understands what's going on here and what Windows is doing, please let me know. It's really been driving me crazy.

I'd REALLY REALLY REALLY like to start spending my time on my OWN code for a change. I'm getting SICK and TIRED of messing with bugs in other people's code. Maybe I'm crazy for caring about details like this. I know I'm a perfectionist. Some people think that's a bad thing (there are lots of books on it). But I think that if there were more perfectionists around, then maybe some of this stuff that we all spend our hard-earned money on would actually work. It's no wonder that so many programs don't work correctly when you can't even trust the operating system or development tools. They just ship what they have and figure that it's not worth their time or money to fix small bugs like this because by the time someone finds out about them, they'll be working on a new operating system and won't need to support the old one anymore.

Yeah, I'm probably as guilty as anyone with some of this (ignoring some obscure bugs in zMUD and just releasing zMUDXP instead). But I'm just writing a MUD client...not an Operating System or Development Tool that is going to be used by millions of people. But I still care...I care a lot. Am I the only one who cares about quality anymore?
Taz
GURU


Joined: 28 Sep 2000
Posts: 1395
Location: United Kingdom

PostPosted: Wed Nov 30, 2005 4:48 pm   
 
No, you and every other perfectionist, including me >:)
_________________
Taz :)
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Mon Dec 05, 2005 9:24 pm   
 
I think I'm starting to agree with RainChild that using Database-bound controls is just a really bad idea. Every day is a struggle with this stuff.

Today's adventure: Adding new records!

OK, so imagine a tree-view of your settings: triggers and aliases within classes, which might be within other classes, etc.

Now, you want to add a new alias to the current folder. The data-aware tree view creates the tree structure based upon "parent" ID values. In the case of aliases within a folder, each alias has a "Parent" field whose value is the ID field of the class that it is contained in.

So, to add a new alias to the current class folder, you do something like this:
Code:
Dataset.Append;
Dataset.FieldByName('parent').AsInteger := ClassID;  // ClassID = ID of the current class folder


Seems simple enough. We don't want to "Post" the record yet, because the user might not enter any value for the node; if they just click the New button, then click somewhere else in the tree without filling in any information, we want to call "Dataset.Cancel" to cancel the adding of the new node.

Well, unfortunately with the DevExpress DBTreeView, the above code only works some of the time. If the current class is expanded in the tree and already has items in it, then it works fine. But if you are adding the first item to a class folder, then you get Access Violations all over the place. Seems that the DevExpress code wants to first "Expand" the current folder (even though it doesn't have anything in it yet). And in the process of trying to expand the folder, it calls something called "CancelOperation" which cancels the current database operation. Cancelling the operation cancels our "Append" action, so now the new record no longer exists, and when the DevExpress tree tries to display the new child node, it gives an Access Violation because the database record no longer exists.

The only way to keep the tree from trying to update itself when adding a new record is by calling the "DataSet.DisableControls" method. This tells all controls attached to the database not to update until a corresponding "DataSet.EnableControls" is given later. Unfortunately, this also cancels any pending dataset operations.

So there is no way to append a record and leave it in edit mode in case the user changes his mind.

The only way to do it is with the following code:
Code:
Dataset.DisableControls;
Dataset.Append;
Dataset.FieldByName('parent').AsInteger := ClassID;
Dataset.Post;
Dataset.EnableControls;
Dataset.Edit;

This appends a new record, causes the tree view to update (when EnableControls is called), and then puts us back into Edit mode so that the user can immediately start entering data into the record fields.

But now we have a "blank" record stored in the database. So when the user clicks somewhere else in the tree before making any changes, the blank record stays behind.

So now we have to kludge the whole thing by adding an event handler to the "BeforeScroll" event for the dataset. The dataset is "scrolled" whenever a new record is selected. In the BeforeScroll event, we can check to see if the current record is blank and if so, delete it before scrolling to the new record.

Well, this added a new problem. When a dataset is opened, a routine called "Dataset.First" is called to move to the first record of the dataset. If the dataset is empty, then our current record is blank and our BeforeScroll event tries to delete it. But since there isn't anything in the dataset yet, it gives an error.

OK, so we add a condition to our BeforeScroll routine so that it only deletes if the "Dataset.RecordCount > 0". However, the stupid 3rd party database drivers return a RecordCount of 1, even for an empty dataset!! OK, so we change it to "Dataset.RecordCount > 1" and ignore the fact that this will prevent it from deleting a blank setting if it's the only thing in the database.
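
Put together, the BeforeScroll handler ends up being something like this (only a sketch; the class, dataset, and 'name' field are illustrative, and "blank" can be whatever test makes sense for the real settings table):
Code:
// Throw away the half-finished record if the user scrolls off it without
// entering anything. RecordCount > 1 because these drivers report 1 even
// for an empty dataset.
procedure TSettingsEditor.MemSettingsBeforeScroll(DataSet: TDataSet);
begin
  if (DataSet.RecordCount > 1) and DataSet.FieldByName('name').IsNull then
    DataSet.Delete;
end;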

All of that ... just to add a new setting to the database and work correctly if the user clicks on something else before entering data into the new record. What a mess.

But that's the problem with databases. The whole reason I'm using databases in zMUDXP is for data integrity. I want to eliminate the problems that cause "corrupted settings files". Sure, a database can still get corrupted, but it's a lot harder and rare. Most databases are designed with the fundamental goal *not* to corrupt the entire database. And even if a table gets corrupted, it's usually easy to repair without losing the entire database.

But data integrity comes with a price. And we see an example of it here. When you Append a new record to the database, you can't just leave it hanging there without being saved to see if the user clicks on some other record. The new record is either IN the database, or it's not. With the corrected code, a blank record is immediately added to the database so that there is a record to edit. Then if something happens, like the power goes off, the record is still stored in the database.

I should have realized that some of the stuff I used to do in zMUD wouldn't work the same with the new database. And these are the kinds of issues that are taking most of the time to get working.
Rainchild
Wizard


Joined: 10 Oct 2000
Posts: 1551
Location: Australia

PostPosted: Tue Dec 06, 2005 9:26 pm   
 
The way I do my database stuff (and it's fairly automatic, almost as good as a data aware control) is I wrote a custom SQL-enabled panel (when I ported it to .NET I used a form because I never needed just a portion of a form to be 'data-enabled' so it made sense to do the whole form)... the code behind it involved a little bit of type checking and casting and I haven't added support for swapping between multiple tables, one form = one base table (but you can support child tables as listviews).

What I do is this:
- select the record from the database
- scroll recursively through all child controls on the panel/form
- if the child control is named db_xxxxx (where xxxxx matches a column name), populate it with the value from the database
- save that populated value to an 'original values' stringlist so I can tell what has been modified later
- let the user manipulate the panel/form until they call the 'save' or 'cancel' event
- if they choose save, loop through the original values list and find the new value from the child control
- if the value matches then it hasn't changed and ignore it, if it has changed add it to a sql statement
- finally, if there is a sql statement to execute (eg if 1 value or more has changed) then execute it in a try { } block in case there's any errors (like violations of foreign key, unique constraint, etc)
- tada, done ...

The main tricky bit is the type checking of the controls and data formats... for example:
- to do the recursive refresh you need to loop through the Frame->Controls[ xx ]; list (use Frame->ControlCount to find out how many children it's got)
- then if the child control->ClassType( ) == the __classid( ) of TPageControl or TPanel or other such container controls you need to recursively go in deeper
- if the control's name doesn't start with db_ then you can continue to the next one
- if the control->ClassType( ) == the __classid( ) of TListView or I guess TTreeView then you need to do special processing to populate those
- if it's a TEdit then you need to grab the data and convert it to a string
- if it's a TComboBox then you may need to populate the selectable items with the foreign keys/values on a separate table
- if it's a TCheckBox then you need to make sure the FieldDef->DataType is ftBoolean and check it or not as necessary
- if it's a TDateTimePicker then you need to make sure the field is a date time and populate it (though you may not need to store datetime's in a project like zmud).
- etc

I'm happy to give you C++ Builder snippets if you want 'em, but I use BDE and SQL Server so it might be more porting than it's worth.
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Wed Dec 07, 2005 6:19 pm   
 
Hey, RainChild, that's really funny!! That's almost exactly what I'm doing right now too! I still use some data-aware controls, but in the various "preferences" pages, I have a routine called UpdateDBForm that loops through the controls on the form looking for "dbxxxxx" fields and doing just what you mentioned. Right now it's pretty simple since I'm only doing it for Boolean fields (checkboxes), numeric fields (spinner controls), and text fields. Haven't needed DateTime fields like you mentioned, and haven't run into comboboxes yet (although probably will).
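
The shape of it is roughly this (just a sketch with made-up names, not the actual zMUDXP code, and only the checkbox and edit-box cases shown):
Code:
// Walk the form's components and fill any control named "db<field>"
// from the matching field in the dataset.
procedure TPrefsForm.UpdateDBForm(DataSet: TDataSet);
var
  i: Integer;
  Ctrl: TComponent;
  Field: TField;
begin
  for i := 0 to ComponentCount - 1 do
  begin
    Ctrl := Components[i];
    if Copy(Ctrl.Name, 1, 2) <> 'db' then Continue;
    Field := DataSet.FindField(Copy(Ctrl.Name, 3, MaxInt));
    if Field = nil then Continue;
    if Ctrl is TCheckBox then
      TCheckBox(Ctrl).Checked := Field.AsBoolean   // boolean preference
    else if Ctrl is TEdit then
      TEdit(Ctrl).Text := Field.AsString;          // text preference
    // spinner controls and other types follow the same pattern
  end;
end;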

In fact, for the Preferences pages in zMUDXP, I am now mostly autogenerating those forms from the database itself. So instead of 30 different forms, I can do it all in code from a data-driven list of preference variables. This has the nice side effect of allowing people to change any preference in zMUDXP within a script.

Anyway, I just thought it was pretty funny that we are doing almost the same thing. Great minds think alike, and all that!
Zugg
MASTER


Joined: 25 Sep 2000
Posts: 23379
Location: Colorado, USA

PostPosted: Sat Dec 10, 2005 11:32 pm   
 
I should probably change the name of this blog topic to "CMUD Progress". Oh well, I'll just let it be our secret for now.

Today was a good day. I got something really complicated working.

The difficult issue that I worked on today is dealing with multiple "package" files in CMUD. You see, each package is stored in its own database file (SQLite database format, or XML format). For example, you might have something like GLOBAL.PKG to load the global settings, then MYTRIGGERS.PKG to load your own triggers. In this simple case, each PKG file really corresponds to an old *.MUD settings file.

Now, in CMUD, both of these packages get loaded into memory. In the Package Editor, I want you to be able to edit any setting in any package in memory, so each package becomes a node in a big hierarchical settings tree. There is a node called "Global" and a node called "MyTriggers" and when you expand each of these nodes in the tree you see the individual aliases, triggers, etc within that package.

OK, that's how I want it to work. But let's look at the underlying complexity of this. First, each PKG file is an SQL database. Now, in CMUD we need to show a hierarchical tree view of *all* packages in memory. This implies that all packages in memory need to be in a single database or single data structure. The TreeView just treats every node in the tree the same...there is no way for it to know that some nodes come from one package, and other nodes come from another package. So for the TreeView, every setting needs to be coming from the same place.

That means that the database within the CMUD memory is actually a collection of packages. Fortunately, my database structure for packages already allows multiple "subpackages" within a package. In fact, it handles a package just like a class folder, so you can have as many packages nested within each other as you want.

So, I can use my existing database structure without any problem. Now we need to load each individual package file into the in-memory database table. Here is the first complexity: the first setting (trigger, alias, etc) in the first package has a record ID of zero. The first setting in the second package also has an ID of zero. So when you load both packages into memory, you have duplicate ID fields. So during the load process you need to merge the new package into the in-memory table and adjust all of the ID fields so that each record is unique.

What I decided to do here was use the upper 12 bits of the ID field as the package ID. This gives 20 bits for the actual setting ID, which allows over one-million settings per package file. Using 12 bits for the package ID allows 2048 packages to be loaded into memory. These seem like reasonable limits. So as I load a package into memory, I "patch" the upper 12 bits of each ID field with the package index number. I also save the link to the SQL database file for that package for later.
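
In code the packing is just a couple of shifts and masks, something like this (a sketch; the constant and function names are only illustrative):
Code:
const
  PackageShift = 20;         // the setting ID lives in the lower 20 bits
  SettingMask  = $000FFFFF;  // mask for those lower 20 bits

// Merge a package index and a per-package record ID into one unique in-memory ID
function PackID(PkgIndex, LocalID: Cardinal): Cardinal;
begin
  Result := (PkgIndex shl PackageShift) or (LocalID and SettingMask);
end;

// Which loaded package does an in-memory ID belong to?
function PackageIndexOf(ID: Cardinal): Cardinal;
begin
  Result := ID shr PackageShift;
end;

// Strip the package bits back off before writing the record to its own PKG file
function LocalIDOf(ID: Cardinal): Cardinal;
begin
  Result := ID and SettingMask;
end;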

OK, this works and lets me load multiple packages and merge them in memory. Now what about updates?

The way updates are handled in CMUD is using SQL batch transactions. You don't want to write an update to the disk file every time something (like a variable) gets changed in memory. So CMUD accumulates the changes being made to the database, and then sends a batch of updates to the SQL database at various times. For example, when using the Package Editor, each setting is updated to the database whenever you move to the next setting, so the disk file is almost always fully up to date. But when you exit the Package Editor and CMUD is just running, then the package file is only updated at an interval that you can set (like every 5 minutes). This speeds up changing CMUD variables (#VAR command) since it doesn't have to write to the disk file. I have other improvements planned for this, but let's ignore the other optimizations for now.

OK, back to the Package Editor. Imagine that I have multiple packages loaded and I've made multiple changes to multiple packages. Now when I want to apply the batch of update transactions from the in-memory database table I need to figure out which SQL database file the package was originally loaded from, and make sure the update gets applied to the correct file. For example, if I update the Global toolbar package, I want the changes applied to the GLOBAL.PKG file that was originally loaded.

This is why I saved a reference to the original database file when the package was loaded. By looking in the upper 12 bits of the record ID for the record being updated, I can determine which package it is and make sure I apply the SQL UPDATE command to that database.

But wait...the record ID values in the in-memory table are no longer the same as the ID values in the original SQL database. So before applying the update, I also have to make sure I strip off the top 12 bits so that the ID value written to the database no longer has the package information in it.

Yeah!! It all works! Each individual PKG database file is a standalone package, but I can load multiple packages into the Package Editor and apply multiple changes to multiple packages and then in a single save step I can update all of the correct database files. Works just like it should and all of the complexity is hidden from the user.

I have to give some kudos to one of the 3rd party components I'm using. I'm using something called "kbmMemTable" which is an in-memory TDataSet for Delphi. This component has some really excellent hooks and methods for loading and merging data from other datasets (like SQLite datasets), and a nice "Updater" class that helps apply any updates made to the in-memory table back to their original SQL databases.

Using this MemTable, I can load an SQL database into memory, then do all of my manipulations in memory where it's really fast, then batch the updates back to the SQL file. This is great when you want to store your data in SQL format, but you don't want all of the overhead of the SQL database in memory all of the time.

MemTable also has a really nifty concept called a "Filtered Index". It's a special database index that also "filters" the data in the dataset. Switching between these filtered indexes is really fast, and it's a way to mimic SQL queries but with much greater performance. You can also attach multiple MemTable datasets to a single physical store of the data, allowing multiple cursors into the same data.

All of this flexibility has really helped solve a lot of the problems you typically have dealing with SQL databases in a non client-server application like CMUD. It's really going to help a *lot* when I eventually convert the Mapper into SQLite format and get away from all of the problems and overhead of ADO.

Anyway, that's another insider look into the life of Zugg and the ongoing development of CMUD.
MattLofton
GURU


Joined: 23 Dec 2000
Posts: 4834
Location: USA

PostPosted: Sun Dec 11, 2005 5:15 am   
 
That's pretty cool, but I do have one question still. When you switch over the mapper, CMUD's not going to melt my computer as it blazes away, is it? :)
_________________
EDIT: I didn't like my old signature
Rainchild
Wizard


Joined: 10 Oct 2000
Posts: 1551
Location: Australia

PostPosted: Sun Dec 11, 2005 8:37 pm   
 
I may take the lazy programmer's way to uniqueness, but in my line of work, where there can be people on WAN/CDMA and batch links (rather than connected via local LAN/WLAN), I find it much more useful to use a GUID. Yeah, it's 4x the size of the field you are using, but the guaranteed uniqueness is headache free. There are downsides, of course: because every ID is 4x the size, it can slow down queries marginally due to having larger indexes.

Two things you should really consider with the batch writing to the DB are ...
1) use a lower priority thread to do it in the background so that you don't lag output while doing it
2) only write one package at a time: rather than writing every package at the 5 minute tick, write one package every 10 seconds on a revolving counter. Or rather, 5 mins = 300 seconds, so if you have 20 packages loaded you should write one package every 300/20 = 15 seconds. That works up to about 600 packages (one per half second is probably fine as a background thread; I don't know what SQLite benchmarks like, but with the main engines I've used in the past you're unlucky if your update takes more than about 5ms). Beyond 600 packages, you might need to start doing updates every 2-4 seconds with multiple packages written at once.
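
Something along those lines, say off a timer (just a rough sketch; SavePackage, LoadedPackageCount, and the rest are placeholders for whatever you've already got internally):
Code:
// Flush one package's pending updates per tick instead of all of them at once.
procedure TSaveScheduler.SaveTimerTimer(Sender: TObject);
begin
  if LoadedPackageCount = 0 then Exit;
  SavePackage(FNextPackage);                       // write just this one package
  FNextPackage := (FNextPackage + 1) mod LoadedPackageCount;
  // spread a full pass over ~5 minutes, but never tick faster than
  // twice a second (Max is from the Math unit)
  SaveTimer.Interval := Max(500, (5 * 60 * 1000) div LoadedPackageCount);
end;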

As for memory use, I think you should really start scaling things up, for example I have 2 gigs in my main system, and 1 gig in my laptop and web-browser-pc, so if there was some kinda 'use this much RAM for mapper cache' or whatever, that would be really cool. Have the entire map loaded into the mem tables, and cache updates on it every 5 seconds.