Quote:
Originally Posted by Ravhin
Hey,
First - thanks! I use this thing every time I'm in EC. A few suggestions:
1) Although the plots are beautiful, have you considered something a bit... nicer? (e.g. http://pchart.sourceforge.net/screenshots.php?ID=2)
2) Things are a bit messy with duplicates, fake things, badly parsed things, etc. Either (a) a master list of items with aliases that get mapped to the item, or (b) a way to link two separate names so that they are known to be the same thing (i.e. "aon" and "amulet of necropotence").
3) Market trends beyond just item-by-item. For example: the top 20 most-traded items and their price trends over time, volume/number of trades over time, and some measure of inflation / plat in the economy over time.
4) A few parsing cases that are handled badly right now:
"bracers x2" -> "bracers x" (with some weird price I suspect)
"reed belt - zaharn's coronet - diamond - brigine tunic" (dash as an item separator)
abbreviation grouping (e.g. "plat" -> "platinum", "rubi" -> "rubicite")
"selling" as a WTS tag
That's all for now.
1) pChart would be fantastic, but I'm not using PHP. If you find something similar in Ruby or JavaScript, let me know. I prefer JavaScript, since there's no need to do this on the server.
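Roughly the split I have in mind, as a minimal sketch (assuming a Sinatra-style app; the route and data store here are made up): the server only hands raw price points to the browser as JSON, and whatever JS chart library we pick does the drawing client-side.

[code]
# sketch: server serves raw data only; the browser renders the chart
require 'sinatra'
require 'json'

# hypothetical in-memory store of [timestamp, price] points per item name
PRICES = Hash.new { |hash, item| hash[item] = [] }

get '/prices/:item' do
  content_type :json
  # the client-side charting library fetches this and plots it
  PRICES[params[:item]].to_json
end
[/code]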
2) I don't have the time to sit down and make a list of abbreviation-to-item mappings, but if someone wants to start doing this, I'll be glad to add it in. Also, if someone can find a link to the item database that P99 uses from EQEmu, so I can get a list of item names, that would be neat too; then I could check whether something is actually an item when I run the parse.
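If anyone wants to start on that list, something as dumb as a hash would do. A sketch, where the entries are just the examples from the post above:

[code]
# sketch: fold auction shorthand down to canonical item names
ALIASES = {
  'aon'  => 'Amulet of Necropotence',
  'plat' => 'Platinum',
  'rubi' => 'Rubicite'
}.freeze

def canonical_name(raw)
  ALIASES.fetch(raw.strip.downcase, raw)
end
[/code]

Once a dump of real item names exists, the same lookup could double as the "is this actually an item" check.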
3) I'll try to make something simple, but calculating trends can be tricky: if the data doesn't get updated often enough, the trend will most likely be misleading. And you can't really calculate volume, because you don't know whether a sale actually happened; this just reads the auction-channel spam from your log file.
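By "something simple" I mean something like a daily median of asking prices. A sketch, where quotes is a hypothetical array of [Time, price] pairs from parsed auctions (asking prices only, since we never see confirmed sales):

[code]
require 'date'

# sketch: daily median asking price -- a crude trend line, not real volume
def daily_median_prices(quotes)
  quotes.group_by { |time, _price| time.to_date }.sort.map do |day, points|
    prices = points.map { |_time, price| price }.sort
    mid = prices.length / 2
    median = prices.length.odd? ? prices[mid] : (prices[mid - 1] + prices[mid]) / 2.0
    [day, median]
  end
end
[/code]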
4) Yeah, I know. It's because the current parser is "good enough". I need a better set of test data to test against, so I'm retaining all logs uploaded for the next few days; that way I can try to build a solid test set, and maybe implement a Bayesian learning system.
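For the record, the kind of fixes I mean, as a rough sketch (the regexes are guesses at the cases listed above, not the live parser):

[code]
# sketch of the parse fixes above: "selling" as a WTS tag, dashes as
# item separators, and trailing quantities like "bracers x2"
SELL_TAGS = /\b(?:wts|selling)\b/i

def split_items(line)
  body = line.sub(SELL_TAGS, '')
  # split on commas or spaced dashes ("reed belt - diamond - ...")
  body.split(/\s+-\s+|,/).map do |chunk|
    name = chunk.strip
    qty = 1
    # peel off a trailing quantity instead of eating it into the name
    if name =~ /\A(.+?)\s*x\s*(\d+)\z/i
      name, qty = $1, $2.to_i
    end
    [name, qty]
  end.reject { |name, _qty| name.empty? }
end

split_items("WTS reed belt - diamond x3")
# => [["reed belt", 1], ["diamond", 3]]
[/code]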