
Reporting: SQL Performance and Tuning


This is a list of some techniques used successfully at several different sites.

Getting Maximum Index Usage

1) Verify "update statistics" has been run on the tables involved;
then run "sp_recompile" on them so existing procedures pick up
the new statistics.
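
As a sketch, for a report driven off a customer table (the table name
here is hypothetical):

```sql
-- refresh optimizer statistics, then flag dependent
-- procedures for recompilation
update statistics customer
exec sp_recompile customer
```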

2) Verify any declared variables have the same datatype as their
corresponding columns - a mismatch here is a common cause of the
optimizer ignoring an otherwise usable index.
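
For example, assuming a hypothetical schema where customer.cust_id
is declared int:

```sql
-- column customer.cust_id is int in this hypothetical schema
declare @cust_id int             -- matches the column: index usable
-- declare @cust_id numeric(10)  -- mismatch: may force a datatype
--                                  conversion and a table scan

select name
from customer
where cust_id = @cust_id
```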

3) Force index usage as follows, with a hint:
from customer (index idx_customer2)
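
In context, the hint sits directly after the table name (column names
here are illustrative):

```sql
select cust_id, last_name
from customer (index idx_customer2)
where last_name like 'SM%'
```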

4) If the query joins many tables, increase the number of tables the
optimizer considers at once with the "set table count" option.

Example: set table count 6

Then, compile the procedure in the same session.

5) If temp tables are being used, put the temp table
creation statements in one procedure, and the
processing SQL in another procedure. This allows
the optimizer to form a query plan on the already
established tables.
proc_driver calls proc_create_temp_tables
proc_driver calls proc_generate_report
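
One way to sketch this (table and procedure names are illustrative):
create the temp table first, then create the report procedure in the
same session, so it compiles against an established table:

```sql
create table #emp_summary
(
    emp_id    int,
    last_name varchar(30),
    salary    money
)
go

-- compiled while #emp_summary already exists, so the optimizer
-- can form a query plan against the established table
create proc proc_generate_report
as
    select last_name, salary
    from #emp_summary
    order by salary desc
go
```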

General SQL Programming

- Plan for growth. If the driver table doubled or tripled in size,
would the report still function?

- Avoid dumb comparisons in the where clause, like
where @emp_id > 0

- where the logic permits, use "WHERE EXISTS ( )" rather than "WHERE NOT EXISTS ( )"

- use "!=" rather than "<>"

- use "IS NOT NULL" rather than "<>NULL"

- use "IS NULL" rather than "=NULL"

- avoid distinct if possible ; see cursor loop option below

- use meaningful names for temp tables ... don't use #temp (lame)
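
A couple of the points above, side by side (commission is a
hypothetical nullable column):

```sql
-- correct null tests
select emp_id from employee where commission is null
select emp_id from employee where commission is not null

-- these compare against NULL with = and != and will not
-- match rows the way the IS NULL forms do:
-- where commission = null
-- where commission != null
```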

Report Structure Approaches

1) Single query

Single query reports are rare - usually they involve getting a simple list.

- Don't try to 'shoehorn' SQL into one statement. Shorter programs are
great for C or Perl applications, but this is not the case in SQL.
Think "Bigger is Better" (and more maintainable).

- Keep queries from using more than four tables if possible.

2) Cursor on driver table(s), with IF..THEN processing in loop

Using a cursor for complex reports almost always increases performance
when large tables and a lot of joins are involved.

- Keep cursor queries from using more than two tables if possible,
make sure this query performs well on its own.

- Try to have a unique key of some sort available within the tables involved.
Strange results have been known to occur when a cursor is scanning
rows that are exactly alike.

- Don't use cursors for updating.

- Use IF statements for filtering results even further. In most cases,
a code construct like the one below is better than cramming the
logic into a where clause:

IF <cond-a>
    < processing >

IF <cond-b> and <cond-c>
    < processing >

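
As a sketch, inside the cursor loop the filtering might look like this
(column and temp table names are hypothetical):

```sql
-- filter fetched rows with IF logic instead of pushing
-- every condition into the driver query's where clause
if @salary > 100000
begin
    insert #high_paid (emp_id, last_name, salary)
    values (@emp_id, @last_name, @salary)
end

if @salary > 100000 and @last_name like 'S%'
    select @s_count = @s_count + 1
```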

3) Set processing without cursors

This technique should be attempted when even a cursor construct fails to
achieve the desired performance.

Basically, the driver query is re-run with each iteration of the loop.

Sample, with cursor:

declare cursor1 cursor for
    select emp_id, last_name, salary
    from employee

open cursor1

fetch cursor1 into @emp_id, @last_name, @salary

while (@@sqlstatus = 0)
begin

    < processing >

    fetch cursor1 into @emp_id, @last_name, @salary

end

close cursor1
deallocate cursor cursor1

Sample, with set processing:

select @emp_id = 0, @loop = 1

while (@loop > 0)
begin

    set rowcount 1

    select @emp_id = emp_id,
           @last_name = last_name,
           @salary = salary
    from employee
    where emp_id > @emp_id
    order by 1

    select @loop = @@rowcount

    set rowcount 0

    if @loop > 0
    begin
        < processing >
    end

end



Transaction Log Filling Up ?

If the transaction log is filling up, for tempdb or the main database, there
is likely something wrong with the report logic.

Things to check:

- Instead of repetitively updating each row, can the values be obtained
ahead of time, and then inserted with a single transaction ?

- Are the "joined" updates occurring on each row only once? When updating
using a join statement, make sure that the tables in question
are joined in a way that avoids duplicate rows. Try running the
SQL statement as a SELECT first to check.

- Are you cramming 500,000 rows from a temp table into a db table?
Try eliminating the temp table.

- Create indexes on updated/inserted tables after the fact.

- Use "set rowcount" along with "waitfor delay" if log problems persist

*** A proper DBA will never extend the log segment based on the needs of a
single process.
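
The "set rowcount" / "waitfor delay" idea can be sketched as a batched
delete (the table and criteria here are hypothetical):

```sql
declare @done int
select @done = 0

set rowcount 1000                  -- work in small batches

while (@done = 0)
begin
    delete from audit_history
    where run_date < '20180101'

    if @@rowcount = 0
        select @done = 1
    else
        waitfor delay '00:00:05'   -- give the log a chance to drain
end

set rowcount 0                     -- always restore the default
```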



Copyright © 2019 Stoltenbar Inc All Rights Reserved.