I'm making an application that does the following:
1) Downloads a list of processes and memory offsets
2) Monitors for those processes. When it finds one, it starts watching those memory offsets and sends the data to a node.js web server, which in turn stores it in a MySQL database.
It's for monitoring statistics in MMORPGs. For example, the initial packet from the web server might look something like this (in JSON for simplicity):
javascript code:
[
    {
        Process: 'lotro',
        Memory: {
            PvP_Ranking: {
                Interval: 30000,
                Offsets: [ 0x4592, 0x254, 0xCE ]
            },
            PvP_Rating: {
                Interval: 30000,
                Offsets: [ 0x412C, 0x92, 0xF2 ]
            }
        }
    },
    {
        Process: 'wow',
        Memory: {
            PvP_Ranking: {
                Interval: 30000,
                Offsets: [ 0x4412, 0x524, 0xCF ]
            },
            PvP_Rating: {
                Interval: 30000,
                Offsets: [ 0x41C, 0x192, 0xF2B, 0x11F ]
            }
        }
    }
]
Every [Interval], the client sends the current value of each stat back up to the node.js web server.
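The client loop I have in mind looks roughly like this (just a sketch — readMemory and send are hypothetical placeholders for whatever native memory-reading call and HTTP upload I end up using, not real APIs):

```javascript
// Sketch of the client-side scheduler for the config above.
// readMemory and send are hypothetical placeholders, not real APIs.
const config = [{
    Process: 'lotro',
    Memory: {
        PvP_Ranking: { Interval: 30000, Offsets: [0x4592, 0x254, 0xCE] }
    }
}];

// Flatten the nested config into one task per monitored stat.
function toTasks(config) {
    const tasks = [];
    for (const game of config) {
        for (const [stat, spec] of Object.entries(game.Memory)) {
            tasks.push({
                process: game.Process,
                stat: stat,
                interval: spec.Interval,
                offsets: spec.Offsets
            });
        }
    }
    return tasks;
}

// Each task gets its own timer, so stats can poll at different intervals.
function schedule(tasks, readMemory, send) {
    for (const t of tasks) {
        setInterval(() => {
            const value = readMemory(t.process, t.offsets);
            send({ process: t.process, stat: t.stat, value: value, timestamp: Date.now() });
        }, t.interval);
    }
}
```

One timer per stat means each game's stats can have their own Interval without any shared clock.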
The website will then let you browse your stats over time and generate graphs and make predictions and whatnot.
I'm trying to work out the best way to store the data in the database.
My initial thought was just to have a single key/value table like this:
UserId | ProcessName | Character_Name | TimeStamp | Key         | Value
-----------------------------------------------------------------------
1      | lotro       | Mikee          | 23123113  | PvP_Ranking | 1000
2      | wow         | Dave           | 23123114  | XP          | 12334
But I'm worried that this table could get very big very quickly.
In theory you'd have maybe 2-3 stats monitored per game. I'm expecting around 40 users, each playing about 25 hours per week, with each stat updating roughly every 15 minutes (4 times per hour).
3 stats * 40 people * 25 hours * 4 per hour = 12,000 rows per week
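Spelling that arithmetic out (the yearly figure is just the weekly one times 52):

```javascript
// Back-of-envelope row growth from the numbers above.
const stats = 3, users = 40, hoursPerWeek = 25, updatesPerHour = 4;
const rowsPerWeek = stats * users * hoursPerWeek * updatesPerHour; // 12,000
const rowsPerYear = rowsPerWeek * 52;                              // 624,000
console.log(rowsPerWeek, rowsPerYear);
```

Even a full year of that (~624k rows) is fairly modest for an indexed MySQL table, if I'm honest with myself.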
Still, this table could become a bit of a beast quite quickly.
Would there be a better way to store this? A table per game, maybe? The stats might update at different intervals, though...
Hmmmmmm...?