Query performance tuning


In the database we have a table that stores all the file names. The application queries this table every minute to check whether a file name already exists, so the query is written as:

    select filename from filenametable where filename like '%abctext%'
The filename is around 190 characters long and there are about 100k records in the table, and it is affecting performance. I have put an index on the filename, but it still does not show a significant improvement. All 100k records need to stay in the table for a month or so, but even with the first 100k I see it cracking up. Is there any way we can fine-tune this query?

asked Aug 31, 2011 at 01:33 PM in Default


Katie 1

Can you share the query plan? (If it doesn't have any sensitive info, of course.)
Aug 31, 2011 at 01:41 PM DaniSQL

When SQL Server searches an index on a character-based column, it matches characters starting from the left (if we're talking about English). Therefore, LIKE 'abctext%' could potentially use an index seek, but that assumes the file name starts with abctext. As soon as the pattern starts with % (LIKE '%abctext' or LIKE '%abctext%'), you're telling SQL Server there can be zero or more characters before abctext, at which point it has to look at that column's value for every row.

Think about trying to go through the phone book knowing only that "hit" appears somewhere in the name, without knowing what characters come before or after it. You could match White, Whit, Whitehead, Chittering, or Doughit. How exactly do you find all matches in the phone book without going through every entry? You don't. You have to consider every single one. And that's why SQL Server won't choose to use an index here, or be able to make good use of one even if it does select it.
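To illustrate, here is a minimal sketch using the table and column names from the question (the index name is made up; the behavior can be confirmed by comparing the actual execution plans):

```sql
-- Illustrative schema matching the question.
CREATE TABLE filenametable (filename varchar(190) NOT NULL);
CREATE INDEX IX_filenametable_filename ON filenametable (filename);

-- Trailing wildcard: the fixed prefix 'abctext' allows an index seek.
SELECT filename FROM filenametable
WHERE filename LIKE 'abctext%';

-- Leading wildcard: any prefix may precede 'abctext', so SQL Server
-- must examine every row (or every index entry), i.e. a scan.
SELECT filename FROM filenametable
WHERE filename LIKE '%abctext%';
```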
Sep 01, 2011 at 12:42 PM K. Brian Kelley
How long is it taking? I have a 264,000-row filename table and the same query takes less than 100 ms.
Sep 02, 2011 at 04:14 PM Scot Hauder
The problem is that the application polls that table every minute to check whether any files exist, and that is causing the overhead. We have a requirement to process files as soon as they are available, so I cannot change the timing.
Sep 05, 2011 at 03:48 PM Katie 1

7 answers

Are the file searches all wildcard searches? I can just type in 'a' and you're going to find aardvark, apple, can, zap... across 100,000 records? Talk to the business. Convince them that the only way to get dependable performance is to modify the requirement so that typing 'a' searches on the prefix and returns only aardvark and apple.

That, or, if this really is the way it must behave, you may want to look at one of the NoSQL databases out there that are specifically targeted at wildcard text searches, especially if you're only doing a name/value pair search as the example query implies.

answered Sep 01, 2011 at 07:49 AM


Grant Fritchey ♦♦

It sounds like your business needs a better naming convention for file names. What you are doing is very inefficient...

answered Nov 16, 2011 at 02:28 AM




Apparently I don't have enough "karma" to comment on an individual post... Anyway, I may be wrong, but I don't believe a full-text index is applicable in this case, for two reasons:

1. I don't think you're allowed to search inside a given "word"; the closest you can get is matching the beginning of a word: http://msdn.microsoft.com/en-us/library/cc879300.aspx

2. Even if you could, I expect the full-text query would be just as expensive as the LIKE query in this case, because the filenames are probably not made up of many words.

Someone please correct me if I'm wrong...

answered Sep 01, 2011 at 01:29 PM




Is there a pattern in the filenames that could be used to create a computed column? You say the files all contain the same pattern, and the rest differs every minute. If so, you could create a computed column that extracts the relevant part of the filename, then index that column. Change your query to filter on the computed column and it should fly.
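A sketch of the idea, assuming the searchable token always sits at a fixed position in the name (the column name FilePart and the SUBSTRING offsets are illustrative; adjust them to the real naming pattern):

```sql
-- Hypothetical: characters 10-16 of the name hold the token searched for.
ALTER TABLE filenametable
ADD FilePart AS SUBSTRING(filename, 10, 7) PERSISTED;

-- Index the extracted token.
CREATE INDEX IX_filenametable_FilePart ON filenametable (FilePart);

-- The wildcard scan becomes an equality seek.
SELECT filename FROM filenametable
WHERE FilePart = 'abctext';
```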

@Kevin Feasel's suggestion of full-text indexing would be my next step if my suggestion didn't work out.

answered Sep 01, 2011 at 12:41 PM



Just realised that's exactly what @weitzera wrote almost a whole day ago! Stupid me, must read all answers before posting.
Sep 01, 2011 at 12:43 PM WilliamD

I'd look into using a [full-text index][1] for this query. If you have the server power to keep that index up to date, performance would be much faster than the %something% search, for the reason @DaniSQL brought up. Rebuilding the index can be pretty slow, but it might be worth giving this a try.

[1]: http://msdn.microsoft.com/en-us/library/ms142571.aspx
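A rough sketch of the setup (the catalog name and the id key column are assumptions; a full-text index requires a unique, single-column, non-nullable key index on the table):

```sql
-- Assumed: filenametable has a unique id column to serve as the key.
CREATE UNIQUE INDEX UX_filenametable_id ON filenametable (id);

CREATE FULLTEXT CATALOG FilenameCatalog;

CREATE FULLTEXT INDEX ON filenametable (filename)
    KEY INDEX UX_filenametable_id
    ON FilenameCatalog;

-- CONTAINS matches whole words or word prefixes, not arbitrary
-- substrings, so this only helps if 'abctext' is a distinct token
-- within the filename.
SELECT filename FROM filenametable
WHERE CONTAINS(filename, '"abctext*"');
```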

answered Sep 01, 2011 at 10:49 AM


Kevin Feasel





Seen: 1727 times

Last Updated: Aug 31, 2011 at 01:33 PM