load testing a database


load testing a database

Tudor Girba-2
Hi,

Do you happen to know methods to approach the problem of testing the
capacity of an application to work with an SQL database?

In particular, I am interested in simulating concurrent requests
towards this database that would resemble the interaction coming from
the application.

Cheers,
Doru

--
www.tudorgirba.com

"Every thing has its own flow"


Re: [Moose-dev] load testing a database

Ben Coman
Tudor Girba wrote:

> Hi,
>
> Do you happen to know methods to approach the problem of testing the
> capacity of an application to work with an SQL database?
>
> In particular, I am interested in simulating concurrent requests
> towards this database that would resemble the interaction coming from
> the application.
>
> Cheers,
> Doru
>
>  
No direct experience - just musing...
I assume it is not a web application, or you would just use any web
stress-check tool - but thinking laterally, perhaps a simple web
interface could be set up for such a tool to work against?





Re: load testing a database

vonbecmann
In reply to this post by Tudor Girba-2
http://jmeter.apache.org/

P.S.: it's the only one I know; I've used it to test a web service.

On Fri, Feb 3, 2012 at 12:11 PM, Tudor Girba <[hidden email]> wrote:
Hi,

Do you happen to know methods to approach the problem of testing the
capacity of an application to work with an SQL database?

In particular, I am interested in simulating concurrent requests
towards this database that would resemble the interaction coming from
the application.

Cheers,
Doru

--
www.tudorgirba.com

"Every thing has its own flow"



Re: load testing a database

Philippe Marschall-2-3
In reply to this post by Tudor Girba-2
On 03.02.2012 16:11, Tudor Girba wrote:
> Hi,
>
> Do you happen to know methods to approach the problem of testing the
> capacity of an application to work with an SQL database?

Which capacity?

> In particular, I am interested in simulating concurrent requests
> towards this database that would resemble the interaction coming from
> the application.

What are you trying to measure?

Well, you know that benchmarking is hard, don't you? I see two possible
ways: first, recording the SQL statements and then replaying them (have
fun with bind parameters); second, just running the application itself.
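The record-and-replay idea can be sketched roughly like this. A minimal sketch, assuming statements were captured with bind-parameter placeholders; Python's sqlite3 module stands in for the real driver, and the statement, table, and column names are all invented:

```python
import sqlite3
import time

# Hypothetical recorded statement; "?" marks the bind parameter.
RECORDED = ("SELECT name FROM customers WHERE id = ?", (1,))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'alice')")

sql, params = RECORDED
start = time.perf_counter()
rows = conn.execute(sql, params).fetchall()   # replay and drain the cursor
elapsed = time.perf_counter() - start

print(rows)             # [('alice',)]
```

The point of keeping the bind parameters is that the replayed statement hits the same prepared-statement path the application uses, instead of forcing a fresh parse for every literal value.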

Oh yeah, Oracle has a tool named RAT (Real Application Testing), no idea
what the marketing department thought there. Maybe you can play similar
tricks with PITR in PostgreSQL.

Cheers
Philippe



Re: load testing a database

Tudor Girba-2
Hi,

Sorry for the late reply, and thanks everyone for the suggestions.

I did not provide many details because I am new to this domain, and I wanted to see from the reactions whether I am missing some relevant direction.

We are trying to measure how an Oracle database can cope with an increase in usage (basically, there will be more users for the application). We are basing our analysis on typical SQL statements coming from the application. We are currently doing load testing by:
- recording SQL statements from use cases that are considered to be important
- generalizing them by replacing actual values with generic variables
- providing meaningful values for the generic variables
- replaying them against the database from several client machines
- consuming the first record from the responses
- reporting the timing of the statements
- recording the CPU, memory and I/O load of the server

However, I am interested in pitfalls, and in how people interpret the results, given that it is hard to determine what typical usage is in terms of which statements to trigger and at what delays.

Cheers,
Doru



On 5 Feb 2012, at 19:02, Philippe Marschall wrote:

> On 03.02.2012 16:11, Tudor Girba wrote:
>> Hi,
>>
>> Do you happen to know methods to approach the problem of testing the
>> capacity of an application to work with an SQL database?
>
> Which capacity?
>
>> In particular, I am interested in simulating concurrent requests
>> towards this database that would resemble the interaction coming from
>> the application.
>
> What are you trying to measure?
>
> Well you know that benchmarking is hard, don't you? I see two possible ways. First recording the SQL statements and then replaying them (have fun with bind parameters). Second just running the application itself.
>
> Oh yeah, Oracle has a tool named RAT, no idea what the marketing department thought there. Maybe you can play similar tricks with PITR in PostgreSQL.
>
> Cheers
> Philippe
>
>

--
www.tudorgirba.com

"Problem solving efficiency grows with the abstractness level of problem understanding."





Re: load testing a database

Philippe Marschall-2-3
On 11.02.2012 19:30, Tudor Girba wrote:
> Hi,
>
> Sorry for the late reply, and thanks everyone for the suggestions.
>
> I did not provide many details because I am new to this domain, and I wanted to see from the reactions whether I am missing some relevant direction.
>
> We are trying to measure how an Oracle database can cope with an increase in usage (basically, there will be more users for the application).

Just more users or also more data? Will the users access the database
through one application, or does each have its own application?

> We are basing our analysis on typical SQL statements coming from the application. We are currently doing load testing by:
> - recording sql statements from some use cases that are considered to be important
> - generalizing them by replacing actual values with generic variables

That shouldn't be necessary: the queries should already contain bind
variables (unless your database layer is crap).

> - providing meaningful values for the generic variables
> - replaying them against the database from several client machines
> - consuming the first record from the responses

Why only the first?

> - reporting the timing of the statements
> - recording the CPU, memory and I/O load of the server

Oracle already provides tools for many of these things.

> However, I am interested in pitfalls, and in the way people interpret the results given that it is hard to determine what is a typical usage in terms of what statements to trigger and at what delays.

Get yourself a competent Oracle DBA, and probably a sysadmin and storage
guy as well. No seriously, you wouldn't want to have GemStone
benchmarked by someone who has never used Smalltalk before, would you?

Cheers
Philippe



Re: load testing a database

Tudor Girba-2
Hi,

On 11 Feb 2012, at 21:13, Philippe Marschall wrote:

> On 11.02.2012 19:30, Tudor Girba wrote:
>> Hi,
>>
>> Sorry for the late reply, and thanks everyone for the suggestions.
>>
>> I did not provide many details because I am new to this domain, and I wanted to see from the reactions whether I am missing some relevant direction.
>>
>> We are trying to measure how an Oracle database can cope with an increase in usage (basically, there will be more users for the application).
>
> Just more users or also more data? Will the users access the database through one application, or does each have its own application?

Both, but users are the more pressing problem. We already have enough data to expose problems. It's one application: a legacy two-tier Delphi application wrapped in a three-tier JEE layer.


>> We are basing our analysis on typical SQL statements coming from the application. We are currently doing load testing by:
>> - recording sql statements from some use cases that are considered to be important
>> - generalizing them by replacing actual values with generic variables
>
> That shouldn't be necessary, the queries should already contain bind variables (unless your database layer is crap).

Well, indeed, this should not happen, but legacy is never clean :). Anyway, there aren't that many cases.


>> - providing meaningful values for the generic variables
>> - replaying them against the database from several client machines
>> - consuming the first record from the responses
>
> Why only the first?

Because we are not interested in the response, only in checking that something gets returned. Is this a problem?


>> - reporting the timing of the statements
>> - recording the CPU, memory and I/O load of the server
>
> Oracle already provides tools for many of these things.
>
>> However, I am interested in pitfalls, and in the way people interpret the results given that it is hard to determine what is a typical usage in terms of what statements to trigger and at what delays.
>
> Get yourself a competent Oracle DBA, and probably a sysadmin and storage guy as well. No seriously, you wouldn't want to have GemStone benchmarked by someone who has never used Smalltalk before, would you?

Thanks, we do have a competent Oracle specialist :). But, this being a tricky job, I thought I would ask around for other experiences.


Cheers,
Doru



>
> Cheers
> Philippe
>
>

--
www.tudorgirba.com

Innovation comes in least expected form.
That is, if it is expected, it already happened.



Re: load testing a database

Ben Coman

Just thinking laterally (since I don't have the experience to apply to
the analysis), a practical business solution is to throw hardware at
the problem - particularly for a legacy application on old hardware.
One site I worked at was having issues with the overnight posting of the
day's accounts taking upwards of 12 hours. The application had outgrown
its initial design. This was brought down to 15 minutes by upgrading
the RAM and moving to Serial Attached SCSI RAID10. The other thing to
consider is the decreasing cost of solid state storage (which I've heard
good things about but haven't used myself yet).

http://www.mcobject.com/in_memory_database - non-Oracle but a good overview
http://www.dba-oracle.com/t_ram_disk_performance.htm
http://www.ramsan.com/files/oracle_performance_tuning_with_ssd.pdf -
"The dramatic price/performance ratio of SSD is changing the way that
Oracle databases are tuned. Sub-optimal Oracle databases no longer have
to undergo expensive and time-consuming re-design. SSD technology is now
competing head-on with Oracle consulting services."

cheers, -ben


Tudor Girba wrote:

> Hi,
>
> On 11 Feb 2012, at 21:13, Philippe Marschall wrote:
>
>  
>> On 11.02.2012 19:30, Tudor Girba wrote:
>>    
>>> Hi,
>>>
>>> Sorry for the late reply, and thanks everyone for the suggestions.
>>>
>>> I did not provide many details because I am new to this domain, and I wanted to see from the reactions whether I am missing some relevant direction.
>>>
>>> We are trying to measure how an Oracle database can cope with an increase in usage (basically, there will be more users for the application).
>>>      
>> Just more users or also more data? Will the users access the database through one application, or does each have its own application?
>>    
>
> Both, but users are the more pressing problem. We already have enough data to expose problems. It's one application: a legacy two-tier Delphi application wrapped in a three-tier JEE layer.
>
>
>  
>>> We are basing our analysis on typical SQL statements coming from the application. We are currently doing load testing by:
>>> - recording sql statements from some use cases that are considered to be important
>>> - generalizing them by replacing actual values with generic variables
>>>      
>> That shouldn't be necessary, the queries should already contain bind variables (unless your database layer is crap).
>>    
>
> Well, indeed, this should not happen, but legacy is never clean :). Anyway, there aren't that many cases.
>
>
>  
>>> - providing meaningful values for the generic variables
>>> - replaying them against the database from several client machines
>>> - consuming the first record from the responses
>>>      
>> Why only the first?
>>    
>
> Because we are not interested in the response. Only to check that something gets returned. Is this a problem?
>
>
>  
>>> - reporting the timing of the statements
>>> - recording the CPU, memory and I/O load of the server
>>>      
>> Oracle already provides tools for many of these things.
>>
>>    
>>> However, I am interested in pitfalls, and in the way people interpret the results given that it is hard to determine what is a typical usage in terms of what statements to trigger and at what delays.
>>>      
>> Get yourself a competent Oracle DBA, and probably a sysadmin and storage guy as well. No seriously, you wouldn't want to have GemStone benchmarked by someone who has never used Smalltalk before, would you?
>>    
>
> Thanks, we do have a competent Oracle specialist :). But, this being a tricky job, I thought of asking around for other experiences.
>
>
> Cheers,
> Doru
>
>
>
>  
>> Cheers
>> Philippe
>>
>>
>>    
>
> --
> www.tudorgirba.com
>
> Innovation comes in least expected form.
> That is, if it is expected, it already happened.
>
>
>
>  



Re: load testing a database

Philippe Marschall-2-3
In reply to this post by Tudor Girba-2
On 11.02.2012 22:03, Tudor Girba wrote:

> Hi,
>
> On 11 Feb 2012, at 21:13, Philippe Marschall wrote:
>
>> On 11.02.2012 19:30, Tudor Girba wrote:
>>> Hi,
>>>
>>> Sorry for the late reply, and thanks everyone for the suggestions.
>>>
>>> I did not provide many details because I am new to this domain, and I wanted to see from the reactions whether I am missing some relevant direction.
>>>
>>> We are trying to measure how an Oracle database can cope with an increase in usage (basically, there will be more users for the application).
>>
>> Just more users or also more data? Will the users access the database through one application, or does each have its own application?
>
> Both, but users are the more pressing problem. We already have enough data to expose problems. It's one application: a legacy two-tier Delphi application wrapped in a three-tier JEE layer.

That sounds fun. Have you considered load testing through the
application instead of the database directly? If the application doesn't
scale, then it doesn't help if the database does.

>>> We are basing our analysis on typical SQL statements coming from the application. We are currently doing load testing by:
>>> - recording sql statements from some use cases that are considered to be important
>>> - generalizing them by replacing actual values with generic variables
>>
>> That shouldn't be necessary, the queries should already contain bind variables (unless your database layer is crap).
>
> Well, indeed, this should not happen, but legacy is never clean :). Anyway, there aren't that many cases.
>
>
>>> - providing meaningful values for the generic variables
>>> - replaying them against the database from several client machines
>>> - consuming the first record from the responses
>>
>> Why only the first?
>
> Because we are not interested in the response. Only to check that something gets returned. Is this a problem?

Yes, it is. I guess the application reads all rows, so you should too;
otherwise you're measuring something other than what the application does.

Imagine you're running
OPTIMIZER_MODE = ALL_ROWS
and streaming a result set containing tens of thousands of rows
into a file. Just looking at the first row isn't going to
tell you very much.
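That gap is easy to demonstrate with any driver. A sketch using sqlite3 as a stand-in (the table and row count are invented): timing only the first fetch hides the cost of materializing the rest of the result set, which is what the application actually pays.

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(100_000)])

def timed(drain):
    """Run the query, drain the cursor with the given strategy, return seconds."""
    start = time.perf_counter()
    cur = conn.execute("SELECT n FROM t")
    drain(cur)
    return time.perf_counter() - start

first_only = timed(lambda cur: cur.fetchone())   # what the harness measures
all_rows = timed(lambda cur: cur.fetchall())     # what the application pays

print(all_rows > first_only)   # draining 100k rows costs measurably more
```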


>>> - reporting the timing of the statements
>>> - recording the CPU, memory and I/O load of the server
>>
>> Oracle already provides tools for many of these things.
>>
>>> However, I am interested in pitfalls, and in the way people interpret the results given that it is hard to determine what is a typical usage in terms of what statements to trigger and at what delays.
>>
>> Get yourself a competent Oracle DBA, and probably a sysadmin and storage guy as well. No seriously, you wouldn't want to have GemStone benchmarked by someone who has never used Smalltalk before, would you?
>
> Thanks, we do have a competent Oracle specialist :). But, this being a tricky job, I thought of asking around for other experiences.

Yes, it is.

Cheers
Philippe



Re: load testing a database

Philippe Marschall-2-3
In reply to this post by Ben Coman
On 12.02.2012 02:58, Ben Coman wrote:

>
> Just thinking laterally (since I don't have the experience to apply to
> the analysis), a practical business solution is to throw hardware at the
> problem - particularly for a legacy application on old hardware. One
> site I worked at was having issues with the overnight posting of the
> day's accounts taking upwards of 12 hours. The application had outgrown
> its initial design. This was brought down to 15 minutes by upgrading the
> RAM and moving to Serial Attached SCSI RAID10. The other thing to consider
> is the decreasing cost of solid state storage (which I've heard good
> things about but haven't used myself yet).

While I wouldn't argue against buying hardware, I would first make sure I
understand the problem so I know what to spend my money on. You also
want to make sure you're not suffering from something "stupid" like a
configuration error or a missing index. There are sometimes small and
cheap changes that can have a dramatic impact.

If you're running 5-year-old hardware, sure, absolutely consider an
upgrade. And yes, SSDs can help dramatically.

Cheers
Philippe