  • Mob DB

    Discussion in 'Archives' started by nET, Jan 16, 2010.

    1. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      Guys,

      Could someone take a look at:

      http://www.buddyforum.de/entry.php?8-CC-Blog

      It's a bit of a long read, but if someone can read it, understand it, and has ideas, I'd like their opinion on it.

      Should I use a local DB, or set up an online one?

      Basically I don't like having to use MS Access, but local means a guaranteed instant connection. Online, however, means an instant community DB and faster queries (once connected)... but the connection isn't guaranteed.
       
    2. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      BTW, the only way I could think to get around DB pollution was this.

      Two options:
      1. You set up a login on the site and insert/select based on your user ID, meaning you only have access to your own data. I'm not sure whether to do this via straight DB connections or via PHP scripts; I think MySQL would be faster, but security-wise I'm really not sure how to restrict it.
      2. You ONLY select from a merged database that is updated once an hour; this is a merge of all the stats from the user databases, and it MAY be polluted with fake info.
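
      Option 2 could be sketched like this in SQLite (a sketch only; the table and column names are illustrative, not from the project): raw rows keyed uniquely on (UserId, MobId), with an hourly job summing them into a merged table that everyone selects from.

```python
import sqlite3

con = sqlite3.connect(":memory:")  # stand-in for the real server database
con.executescript("""
    -- raw per-user stats: users only ever touch rows under their own UserId
    CREATE TABLE UserMobStats (
        UserId INTEGER, MobId INTEGER, MobKillCount INTEGER, MobFlee INTEGER,
        PRIMARY KEY (UserId, MobId)
    );
    -- merged community table, rebuilt on a schedule (the hourly merge)
    CREATE TABLE MergedMobStats (
        MobId INTEGER PRIMARY KEY, MobKillCount INTEGER, MobFlee INTEGER
    );
""")
con.executemany("INSERT INTO UserMobStats VALUES (?, ?, ?, ?)",
                [(1, 29724, 10, 2), (2, 29724, 5, 1)])

# the hourly merge: wipe and rebuild the summary from everyone's raw rows
con.execute("DELETE FROM MergedMobStats")
con.execute("""
    INSERT INTO MergedMobStats
    SELECT MobId, SUM(MobKillCount), SUM(MobFlee)
    FROM UserMobStats GROUP BY MobId
""")
print(con.execute("SELECT * FROM MergedMobStats").fetchall())
# -> [(29724, 15, 3)]
```

      Because the raw rows are split by user, wiping a polluter is a single DELETE on their UserId followed by a forced re-merge.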
       
    3. fpsware

      fpsware Community Developer

      Joined:
      Jan 15, 2010
      Messages:
      5,287
      Likes Received:
      133
      Trophy Points:
      63
      Sign me up. :) Very nice work. What's the speed like when reading/writing to the database?

      Can you add another field or two to the table: DoesMobHeal, and the health at which it heals? If a mob is likely to heal itself, I would save certain spells - silences/interrupts - for such use.
       
    4. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      Local reads are fast, like instant, but that's with a 3-row DB, and the C# OLE code is a nightmare. I'm now testing the speed against MySQL on a webserver.

      And yes, those were the other two lol :)
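
      For what it's worth, a local lookup like that can be sanity-checked in a few lines (Python's built-in sqlite3 here as a stand-in for the C# code; the table name is made up, the timing is the point):

```python
import sqlite3
import time

con = sqlite3.connect(":memory:")  # swap in a file path to test on-disk speed
con.execute("CREATE TABLE MobTable (MobId INTEGER PRIMARY KEY, MobName TEXT)")
con.executemany("INSERT INTO MobTable VALUES (?, ?)",
                [(i, "mob %d" % i) for i in range(10000)])
con.execute("INSERT INTO MobTable VALUES (29724, 'Library Guardian')")

start = time.perf_counter()
row = con.execute("SELECT MobName FROM MobTable WHERE MobId = ?",
                  (29724,)).fetchone()
elapsed_ms = (time.perf_counter() - start) * 1000
print(row[0], "%.3f ms" % elapsed_ms)  # primary-key lookup, sub-millisecond locally
```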
       
    5. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      Well shit, it seems MySQL is just as fast and it's remote (i.e. everyone can upgrade)... but that's connecting straight to the DB, and I have no idea how to lock it down lol. This might end up a closed private project if I continue down this route, I think... I can't trust everyone with a possible way to hack this DB.

      Code:
      MobId       UserId  MobName           MobKillCount  MobCast  MobFlee  MobFleeHP  MobAvgFlee  MobPet  MobSnare  MobFear
      2147483647  0       Library Guardian  0             0        0        0          0           0       0         0
       
    6. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      Added the details in a new entry, mate, if you're interested. The base code is there, so you should be able to start a fork now if you want, as the rest isn't that hard.

      http://www.buddyforum.de/entry.php?13-Getting-somewhere-slowly

      Might look at SQLite tomorrow as an alternative; I'd still like some feedback on which is preferred.

      Also, I don't suppose you know offhand where to get the NPC ID (i.e. the UID like on Wowhead; the GUID is the spawn ID)... really need the CC Dev forum up again :(
       
      Last edited: Jan 16, 2010
    7. laria

      laria Well-Known Member

      Joined:
      Jan 15, 2010
      Messages:
      5,386
      Likes Received:
      36
      Trophy Points:
      48
      Maybe this will help you. It shows how to get the Wowhead ID from the ID found in combatlog.txt, which I guess is what you're talking about:
      http://www.wowwiki.com/API_UnitGUID
       
    8. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      I was hoping it was coded into the HB calls, as it pulls it out when you click "Mob Info"... but I think you may be right that it's making Lua calls... I'll look into it more today.
       
    9. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      I don't understand SQLite :/

      EDIT:

      Okay, figured it out and converted my MySQL library to work with the same function calls... also, it was CurrentTarget.Entry...

      C:\temp\Winamp\Honorbuddy_122>sqlite3 mob.db "select * from MobTable;"
      1|test 1|0|0|0|0|0|0|0|0|0|0
      2|test 2|0|0|0|0|0|0|0|0|0|0
      3|test 3|0|0|0|0|0|0|0|0|0|0
      29724|Library Guardian|0|0|0|0|0|0|0|0|0|0

      [Xibalba 0.06 DB]: Entering MobInDb
      Looking for Mob ID: 29724
      Adding for Mob ID: 29724
      Adding for Mob Name: Library Guardian
      [Xibalba 0.06 DB]: Exiting MobInDb
      .....
      [Xibalba 0.06 DB]: Entering MobInDb
      Looking for Mob ID: 29724
      [Xibalba 0.06 DB]: Mob Exists In DB
      [Xibalba 0.06 DB]: Exiting MobInDb
       
      Last edited: Jan 17, 2010
    10. j0achim

      j0achim New Member

      Joined:
      Jan 15, 2010
      Messages:
      532
      Likes Received:
      15
      Trophy Points:
      0
      This can certainly work out great.

      As a PHP/MySQL developer, I have an idea of how you could treat the data.

      What you do is query the mob ID on demand; the catch is that you only query it once (per botting session). Once you have queried the online database, you store the data in a local array. Updating the global database could be done every set number of kills. And we're talking about a small amount of data here, so speed should not be an issue.
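
      That scheme might look like this (Python sketch; the project itself is C#, and MobCache/MobTable are names I've made up): query each mob once per session, cache it, and push kills back in batches.

```python
import sqlite3

class MobCache:
    """Query each mob once per botting session; batch updates every N kills."""
    def __init__(self, con, flush_every=25):
        self.con = con
        self.cache = {}       # mob_id -> stats pulled this session
        self.pending = {}     # mob_id -> kills not yet written back
        self.flush_every = flush_every

    def get(self, mob_id):
        if mob_id not in self.cache:  # only one query per mob per session
            row = self.con.execute(
                "SELECT MobKillCount FROM MobTable WHERE MobId = ?",
                (mob_id,)).fetchone()
            self.cache[mob_id] = row[0] if row else 0
        return self.cache[mob_id]

    def record_kill(self, mob_id):
        self.pending[mob_id] = self.pending.get(mob_id, 0) + 1
        if sum(self.pending.values()) >= self.flush_every:
            self.flush()

    def flush(self):
        for mob_id, kills in self.pending.items():
            self.con.execute(
                "UPDATE MobTable SET MobKillCount = MobKillCount + ?"
                " WHERE MobId = ?", (kills, mob_id))
        self.con.commit()
        self.pending.clear()

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE MobTable (MobId INTEGER PRIMARY KEY,"
            " MobKillCount INTEGER)")
con.execute("INSERT INTO MobTable VALUES (29724, 100)")
cache = MobCache(con, flush_every=2)
cache.get(29724)          # first call hits the DB; repeats come from the dict
cache.record_kill(29724)  # buffered locally
cache.record_kill(29724)  # hits flush_every -> one UPDATE covers both kills
print(con.execute("SELECT MobKillCount FROM MobTable"
                  " WHERE MobId = 29724").fetchone()[0])
# -> 102
```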
       
      Last edited: Jan 17, 2010
    11. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      See, I thought of that, but the main problem is this:

      You might end up caching 10,000 results into the array depending on how big the DB is. Typically you might get around this with ZoneIds, so you only cache what's in your zone... but what happens if you are jumping between zones?

      I thought the update would be a problem, but your suggestion is a good way around it.

      Now I think I'm going to go down the SQLite route and create an extra program to output the local DB in a decent format, then push it out via an HTTP stream to a PHP script. The PHP script would handle the logic of merging into a centralised MySQL DB and hopefully filter out bullshit results; another PHP script would then convert from MySQL back to SQLite. That would give an updatable mob DB that's a mix between local and community - a nice mixture of both that fixes the bandwidth, speed, and caching problems and also gives other people something they can use.

      I've got all the required SQL handling written in C#; I just need to write the combat logic to call the functions and stop spam updates by only updating a particular GUID once, etc.

      It's taking a bit of tweaking just to get around the issues I'm finding at every step of the way, but nearly there :)
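
      The client half of that pipeline could be as small as this (a sketch: the plugin is C#, the table and field names are mine, and the HTTP push to the PHP merge script is left out):

```python
import json
import sqlite3

def export_local_stats(con):
    """Dump the local SQLite mob stats into a 'decent format' (JSON here),
    ready to send via an HTTP stream to the server-side PHP merge script."""
    rows = con.execute(
        "SELECT MobId, MobName, MobKillCount, MobFlee FROM MobTable").fetchall()
    fields = ("MobId", "MobName", "MobKillCount", "MobFlee")
    return json.dumps([dict(zip(fields, row)) for row in rows])

con = sqlite3.connect(":memory:")  # stand-in for the local mob.db file
con.execute("CREATE TABLE MobTable (MobId INTEGER PRIMARY KEY, MobName TEXT,"
            " MobKillCount INTEGER, MobFlee INTEGER)")
con.execute("INSERT INTO MobTable VALUES (29724, 'Library Guardian', 12, 3)")
payload = export_local_stats(con)
print(payload)  # this body is what the PHP script would merge into MySQL
```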
       
    12. j0achim

      j0achim New Member

      Joined:
      Jan 15, 2010
      Messages:
      532
      Likes Received:
      15
      Trophy Points:
      0
      IMO you are making this much harder than it needs to be.

      As for results, we're talking one area most often having only 2-3 types of mobs. Let's say you go for a very long session and switch areas 5-10 times; you would still only end up with a small dataset, around 50 mobs to handle in an array. There's absolutely no need to download all the data to a local DB beforehand. And you sum all the data, so you only have something like this to work with:

      Code:
      db.<mobid>.doesflee = 1/0 // set if the flee:kill ratio is above a threshold; the threshold could be handled locally depending on what class the CC is for, etc.
      db.<mobid>.hasfled = 1000
      db.<mobid>.killed = 2500
      db.<mobid>.hasmana = 1/0  // we could mana burn
      db.<mobid>.iscaster = 1/0 // seen casting spells above a certain kill ratio; again, could be handled locally per class
      .... we could go on and on about possibilities here.
      

      But having a local SQLite database could also work. As for how to transfer the data, I'd use an ODBC driver and connect to the database directly; then you won't have to bother translating from sqlite -> php -> mysql, or mysql -> php -> array/sqlite.


      To filter out data, you could just give each row an IP/userid/unique identifier/session ID; if a user is polluting the DB with false data, you just wipe that user's data.
       
      Last edited: Jan 17, 2010
    13. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      Yeah, you're on the same page as me; this is all stuff I've looked through over the last two days.

      The thing about zone switching isn't so much how much we have cached, it's the logic required to know you've switched zones and need a new pull of data; otherwise we could end up with stale, irrelevant data from the last zone we were in (I'm thinking a few versions down the line, with full 1-80 levelling and questing). While there isn't a requirement to load all the data, it's more about making sure the data we have is useful.

      The idea behind sqlite -> php -> mysql was a server-side check: filter there and leave the calculations to the webserver rather than put the strain on the CC/plugin/HB. That means the PHP does the work and not the C#; I'm being very cautious about putting extra strain on the application itself, as it could go horribly wrong.
      Also, as it's a local copy, we can cache only what we need at the moment of the pull, since it's literally a 1ms lookup or something stupid, rather than a potential lag on the webserver causing a 5-second pull time.
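
      The zone-switch logic itself can stay small; a sketch of the invalidation (names are illustrative, and the fetch callable stands in for the SQLite/MySQL pull):

```python
class ZoneCache:
    """Hold only the current zone's mob data; re-pull the moment the
    zone changes so stale data from the last zone is never used."""
    def __init__(self, fetch_zone_mobs):
        self.fetch = fetch_zone_mobs  # callable: zone_id -> {mob_id: stats}
        self.zone_id = None
        self.mobs = {}

    def lookup(self, zone_id, mob_id):
        if zone_id != self.zone_id:   # zone switch detected: drop and refetch
            self.zone_id = zone_id
            self.mobs = self.fetch(zone_id)
        return self.mobs.get(mob_id)

pulls = []
def fake_fetch(zone_id):              # stands in for the real DB pull
    pulls.append(zone_id)
    return {29724: {"kills": 12}} if zone_id == 1 else {}

cache = ZoneCache(fake_fetch)
cache.lookup(1, 29724)  # first lookup in zone 1 pulls from the DB
cache.lookup(1, 29724)  # same zone: served from the cache, no pull
cache.lookup(2, 29724)  # zone changed: old data discarded, new pull
print(pulls)  # -> [1, 2]
```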

      I could go about this four ways and leave the option in the config for user choice:
      1. Use local data and share it out
      2. Use locally compiled data; don't share and don't update
      3. Use a local copy of the compiled web data
      4. Read straight from the live data (which is riskier); this really depends on how much bandwidth it starts eating up, webserver speed, etc. (using the caching methods outlined above for zone data)

      That way it covers all four; they all run around the same code anyway, so it's just a matter of a few extra functions and flexible coding.

      To avoid data pollution I was thinking of what you said: a user ID held across two databases. The first database would be the raw data split over user IDs (i.e. a unique key over userid/npcid), meaning you only have access to your own info; as you said, if someone is polluting, you just remove everything under their user ID. An hourly cron would then merge all of that into a separate DB, so when we find corruption we wipe that user's data and force the merge. This would all, however, require a user login to be set up, which isn't going to be as easy as it sounds without hosting it on the HB website and linking it to your HB account (or they could create a script which is basically an "echo $sql[userid]" that I could rip out and cache on startup, since you can't start HB without a session open first).

      I reckon I'm going to have something up and running in a few days, then I'll push out the code for suggestions. Switching between local SQLite and MySQL isn't as hard as you'd think: it's just a different namespace and one or two function changes, so once the core is done, if I decide one is better than the other it won't be much of an overhaul; most seem to be coded around the same ODBC syntax.

      Appreciate the feedback, as you've cleared up a few things in my head about switching between the two. The more that's thrown at me, the more advancements (feature creep) I can see in this. I'm just wondering now if there's any other DB that could be held internally besides mobs - players, items, quests, etc. Not sure if they would be any use, but the mob one is useful for additional AI.

      As for MobInfo, any other suggestions? I've thought of a few more, like FrostResistant/FireResistant, etc., which would be handy for mages so you know to alter your rotation based on whether the attacks would work.
       
      Last edited: Jan 17, 2010
    14. scribbles

      scribbles New Member

      Joined:
      Jan 15, 2010
      Messages:
      61
      Likes Received:
      0
      Trophy Points:
      0
      I feel the amount of effort you're putting into a mob lookup is worth the gains!

      Quest lookup: if you're willing to make the bot quest, that would be perfect!

      [quest id="xxx"]

      [questpickup]
      [npcid]xxxx[/npcid]
      [npcname]xxxx[/npcname]
      [location]xx.xx.xx[/location]
      [/questpickup]

      [questgoal]
      [bunchofstuff]xx[/bunchofstuff]
      [/questgoal]

      [questhandin]
      [npcid]xxxx[/npcid]
      [npcname]xxxx[/npcname]
      [location]xx.xx.xx[/location]
      [/questhandin]
      [/quest]
       
      Last edited: Jan 17, 2010
    15. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      I'll give quests a think over as to whether it's possible. It would need to be a plugin that overrides the profile, I think, plus a PPather-style automesh; it would be a project 20x bigger than this one, if not more, but the challenge intrigues me :)

      Ironically, this code will benefit other CCs more than mine - rogues, mages, shammies, etc. For me it's just flee, casters/heals, and a lot of pre-emptive code.
       
    16. j0achim

      j0achim New Member

      Joined:
      Jan 15, 2010
      Messages:
      532
      Likes Received:
      15
      Trophy Points:
      0
      You will see an increase in performance if you keep all this data in one table.

      With smart use of UPDATE and SUM we can get all the data very quickly, and update it very quickly too.
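
      What that single-table approach might look like (illustrative schema, not the project's; SQLite via Python for the sketch):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE MobStats ("
            "UserId INTEGER, MobId INTEGER, Killed INTEGER, HasFled INTEGER,"
            " PRIMARY KEY (UserId, MobId))")

def bump(con, user_id, mob_id, fled):
    # one UPDATE per event; fall back to INSERT on first sight of the mob
    cur = con.execute(
        "UPDATE MobStats SET Killed = Killed + 1, HasFled = HasFled + ?"
        " WHERE UserId = ? AND MobId = ?", (int(fled), user_id, mob_id))
    if cur.rowcount == 0:
        con.execute("INSERT INTO MobStats VALUES (?, ?, 1, ?)",
                    (user_id, mob_id, int(fled)))

bump(con, 1, 29724, True)
bump(con, 1, 29724, False)
bump(con, 2, 29724, True)

# one SUM pulls the community-wide totals straight off the same table
killed, fled = con.execute(
    "SELECT SUM(Killed), SUM(HasFled) FROM MobStats WHERE MobId = ?",
    (29724,)).fetchone()
print(killed, fled)  # -> 3 2
```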
       
    17. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      BTW, I'm a DBA ;)
       
    18. j0achim

      j0achim New Member

      Joined:
      Jan 15, 2010
      Messages:
      532
      Likes Received:
      15
      Trophy Points:
      0
      You've got it all under control then ;)
       
    19. nET

      nET Member

      Joined:
      Jan 15, 2010
      Messages:
      77
      Likes Received:
      0
      Trophy Points:
      6
      To some extent... I'm more used to 7TB databases on LPARs with 24 P6 CPUs and about 400GB of RAM... I've lost perspective on small tables on weaker machines lol.

      Using less advanced software like SQLite and MySQL is killing me, as I don't have the same features as in MSSQL, Oracle, or even Sybase.
       
    20. j0achim

      j0achim New Member

      Joined:
      Jan 15, 2010
      Messages:
      532
      Likes Received:
      15
      Trophy Points:
      0
      My field is small databases with huge amounts of simple data, where joins never come into it.
       
