
Plugin Optimization Help KeyValues


Xaphan
SourceMod Donor
Join Date: Jun 2008
Old 04-24-2010 , 21:25   Plugin Optimization Help KeyValues
#1

After doing a little research on plugin optimization, I came upon "Avoid Large KeyValues".

Our user preferences use KeyValues. Each user has 9 preferences that they may change during gameplay.

After checking the preference file, we have 6385 unique SteamIDs, again with 9 preferences each. These are pruned every 30 days. The file size is 1.66 MB.

We use about 750 MB of RAM and about 13% CPU usage over a 24-hour period across 4 active 100-tick servers, so the large file doesn't seem to affect us.

So my questions are:
Would switching the KeyValues over to SQLite or MySQL make any difference?
Is there any way to benchmark a plugin with SourceMod?
Scone
Senior Member
Join Date: Apr 2010
Location: England
Old 04-25-2010 , 03:48   Re: Plugin Optimization Help KeyValues
#2

SQL operations have the advantage that they can be threaded, which is important when you're carrying out tens of thousands of string comparisons.

With SQL it is also not necessary to regenerate and write the entire file (all 54,000 records!) to disk when an update is made, whereas a sequential KV file forces you to.

I'd definitely recommend SQL over a KV file for this. If you want user preferences to be shared between servers, use MySQL. Otherwise, use SQLite.
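
For example, here is a rough (untested) sketch of the threaded pattern. The "storage" entry in databases.cfg, the sm_preferences table, and the column names are just placeholders:

Code:
#include <sourcemod>

new Handle:g_hDatabase = INVALID_HANDLE;

public OnPluginStart()
{
    // "storage" is a placeholder section in configs/databases.cfg.
    SQL_TConnect(OnDatabaseConnected, "storage");
}

public OnDatabaseConnected(Handle:owner, Handle:hndl, const String:error[], any:data)
{
    if (hndl == INVALID_HANDLE)
    {
        LogError("Database connection failed: %s", error);
        return;
    }
    g_hDatabase = hndl;
}

public OnClientAuthorized(client, const String:auth[])
{
    if (g_hDatabase == INVALID_HANDLE || IsFakeClient(client))
    {
        return;
    }

    decl String:safeAuth[64], String:query[256];
    SQL_EscapeString(g_hDatabase, auth, safeAuth, sizeof(safeAuth));
    Format(query, sizeof(query), "SELECT * FROM sm_preferences WHERE auth_id = '%s'", safeAuth);

    // Runs on a worker thread; the callback fires later on the main thread.
    SQL_TQuery(g_hDatabase, OnPreferencesLoaded, query, GetClientUserId(client));
}

public OnPreferencesLoaded(Handle:owner, Handle:hndl, const String:error[], any:data)
{
    new client = GetClientOfUserId(data);
    if (client == 0 || hndl == INVALID_HANDLE)
    {
        return;
    }
    if (SQL_FetchRow(hndl))
    {
        // Copy the row into a per-client cache here.
    }
}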
Xaphan
SourceMod Donor
Join Date: Jun 2008
Old 04-25-2010 , 04:26   Re: Plugin Optimization Help KeyValues
#3

I didn't think about that; using MySQL I could then use the same user preferences across all servers.

Thanks Scone
rhelgeby
Veteran Member
Join Date: Oct 2008
Location: 0x4E6F72776179
Old 04-25-2010 , 10:31   Re: Plugin Optimization Help KeyValues
#4

When using an (external or local) SQL DB, I would also cache the players' settings instead of reading from the DB every time. You can store the data in a structure like this:

PHP Code:
enum PlayerSettings
{
    bool:PlayerSettings_InUse,
    PlayerSettings_SomeVar,
    String:PlayerSettings_SomeString[32],
    Float:PlayerSettings_SomeFloat
}

new PlayerSettingCache[MAXPLAYERS + 1][PlayerSettings];
Then you can cache DB results in structures like these when players connect, and write them back to the DB when they leave.
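
A rough sketch of that flow (untested), using the cache array above:

PHP Code:
public OnClientPostAdminCheck(client)
{
    // Mark the slot as used and set defaults until the DB result arrives.
    PlayerSettingCache[client][PlayerSettings_InUse] = true;
    PlayerSettingCache[client][PlayerSettings_SomeVar] = 0;

    // Fire a threaded SELECT here and fill the rest of the slot in its callback.
}

public OnClientDisconnect(client)
{
    if (!PlayerSettingCache[client][PlayerSettings_InUse])
    {
        return;
    }

    // Push the cached values back with a threaded UPDATE/REPLACE, then free the slot.
    PlayerSettingCache[client][PlayerSettings_InUse] = false;
}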

About benchmarking: SourceMod has a profiler, but it's an alpha version. It worked fine when I tested it, though: http://wiki.alliedmods.net/SourceMod_Profiler

The profiler measures the time spent in each event and forward. If you want to measure specific code, you can use the profiler API in SourceMod: http://docs.sourcemod.net/api/index....oad=file&id=18
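
For example, with the profiler API it would look something like this (untested; Command_Benchmark is just a made-up command callback):

PHP Code:
public Action:Command_Benchmark(client, args)
{
    new Handle:prof = CreateProfiler();
    StartProfiling(prof);

    // ... put the code you want to measure here ...

    StopProfiling(prof);
    LogMessage("Section took %f seconds", GetProfilerTime(prof));
    CloseHandle(prof);

    return Plugin_Handled;
}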
Xaphan
SourceMod Donor
Join Date: Jun 2008
Old 04-25-2010 , 16:01   Re: Plugin Optimization Help KeyValues
#5

This seems like what we need to do.

If we cache the players' settings like your example, would other plugins be able to use the cached settings, or would this be limited to that plugin?

I know some things cannot be cloned, but... can a cached player setting be cloned for other plugins to access?

We are just trying to figure out the best way to handle so much data.

Thanks for the reply, rhelgeby, and for the links to the profiler; very helpful.
Scone
Senior Member
Join Date: Apr 2010
Location: England
Old 04-25-2010 , 16:22   Re: Plugin Optimization Help KeyValues
#6

Caching reads is a good idea, but I would tend to make writes immediate, rather than waiting for the player to leave and then overwriting all their settings. This means that:
- If the plugin/server crashes, no data is lost
- If no settings are changed (the most likely case), no writes are performed
When changing a setting, just write to the database and the cache at the same time, then discard the cache when the player leaves.
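
A minimal sketch of that write-through idea (untested; the cache array, database handle, and table/column names are just placeholders):

Code:
new g_PrefCache[MAXPLAYERS + 1];            // cached value of one preference
new Handle:g_hDatabase = INVALID_HANDLE;    // opened elsewhere with SQL_TConnect
new String:g_AuthId[MAXPLAYERS + 1][64];    // stored SteamID per client

// Hypothetical setter, called whenever a player changes this preference.
SetPreference(client, value)
{
    // Update the cache immediately...
    g_PrefCache[client] = value;

    // ...and push the same change to the database right away.
    decl String:query[256];
    Format(query, sizeof(query), "UPDATE sm_preferences SET a_pref = %d WHERE auth_id = '%s'", value, g_AuthId[client]);
    SQL_TQuery(g_hDatabase, OnPreferenceSaved, query);
}

public OnPreferenceSaved(Handle:owner, Handle:hndl, const String:error[], any:data)
{
    if (hndl == INVALID_HANDLE)
    {
        LogError("Preference save failed: %s", error);
    }
}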
pRED*
Join Date: Dec 2006
Old 04-26-2010 , 04:59   Re: Plugin Optimization Help KeyValues
#7

Why not just use clientprefs? It abstracts all the database complexities (including threading), and provides caching as well.
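
For what it's worth, basic clientprefs usage is only a few lines. An untested sketch, with a made-up cookie name:

Code:
#include <sourcemod>
#include <clientprefs>

new Handle:g_hPrefCookie = INVALID_HANDLE;

public OnPluginStart()
{
    // "example_pref" is just a placeholder cookie name.
    g_hPrefCookie = RegClientCookie("example_pref", "Example preference", CookieAccess_Protected);
}

GetPref(client)
{
    decl String:value[8];
    GetClientCookie(client, g_hPrefCookie, value, sizeof(value));
    return StringToInt(value);
}

SetPref(client, value)
{
    decl String:buffer[8];
    IntToString(value, buffer, sizeof(buffer));
    SetClientCookie(client, g_hPrefCookie, buffer);
}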
Scone
Senior Member
Join Date: Apr 2010
Location: England
Old 04-26-2010 , 05:14   Re: Plugin Optimization Help KeyValues
#8

Now where's the fun in that?
Xaphan
SourceMod Donor
Join Date: Jun 2008
Old 04-26-2010 , 15:59   Re: Plugin Optimization Help KeyValues
#9

Quote:
Originally Posted by pRED*
Why not just use clientprefs? It abstracts all the database complexities (including threading), and provides caching as well.

I did try clientprefs a while back; for each pref there was a new row. Again, we would have around 57,000 rows.

Then I changed over to KeyValues, but that seems to be slowing down.

sm_cookie uses 1 row per setting.

The idea was to minimize this to 1 DB and 1 row per user.
Code:
CREATE TABLE `sm_preferences` (
  `auth_id` varchar(64) NOT NULL,
  `a_pref` int(11) DEFAULT '0',
  `b_pref` int(11) DEFAULT '0',
  `c_pref` int(11) DEFAULT '0',
  `d_pref` int(11) DEFAULT '0',
  `e_pref` int(11) DEFAULT '0',
  `f_pref` int(11) DEFAULT '0',
  `g_pref` int(11) DEFAULT '0',
  `h_pref` int(11) DEFAULT '0',
  `timestamp` timestamp NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`auth_id`),
  UNIQUE KEY `auth_id` (`auth_id`)
)
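
Updating a single pref would then be an upsert against that one row, something like this (untested sketch):

Code:
SavePreference(Handle:db, const String:authId[], value)
{
    decl String:safeAuth[64], String:query[512];
    SQL_EscapeString(db, authId, safeAuth, sizeof(safeAuth));

    // One row per user: insert it if missing, otherwise update it in place.
    Format(query, sizeof(query), "INSERT INTO sm_preferences (auth_id, a_pref) VALUES ('%s', %d) ON DUPLICATE KEY UPDATE a_pref = %d", safeAuth, value, value);

    SQL_TQuery(db, OnPrefSaved, query);
}

public OnPrefSaved(Handle:owner, Handle:hndl, const String:error[], any:data)
{
    if (hndl == INVALID_HANDLE)
    {
        LogError("Pref upsert failed: %s", error);
    }
}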
Scone
Senior Member
Join Date: Apr 2010
Location: England
Old 04-26-2010 , 16:16   Re: Plugin Optimization Help KeyValues
#10

MySQL shouldn't struggle at all with 50,000+ rows. The only time I've run into problems was with a table of 100,000,000 records, and even then queries took under a minute to complete.

If you're willing to write the extra code, though, you might get some slight improvement.