Saturday, February 25, 2012

possible Scope_Identity() problem

I have an ASP.NET application with a SQL Server 2000 backend, where two
pages fire off two different stored procedures. Each stored procedure
creates a new record in a particular table then uses Scope_Identity() to get
the id of the newly created record for adding it to a link table. The
problem I'm seeing appears to be Scope_Identity() behaving as though it
were @@IDENTITY, i.e. on the occasion when both procedures are fired at once
(different machines and browsers, not that it should matter), one procedure
appears to get the id of the record created by the other and essentially
steal its record.
Has anyone come across anything like this before?
thanks in advance,
--
jo inferis|||Hi
Do you have code to reproduce this issue?
The only way I could think this happens is if both SPs are executed on the
same connection (which is unlikely).
Regards
Mike
"Jo Inferis" wrote:

> I have an ASP.NET application with a SQL Server 2000 backend, where two
> pages fire off two different stored procedures. Each stored procedure
> creates a new record in a particular table then uses Scope_Identity() to get
> the id of the newly created record for adding it to a link table. The
> problem I'm seeing appears to be Scope_Identity() behaving as though it
> were @@IDENTITY, i.e. on the occasion when both procedures are fired at once
> (different machines and browsers, not that it should matter), one procedure
> appears to get the id of the record created by the other and essentially
> steal its record.
> Has anyone come across anything like this before?
> thanks in advance,
> --
> jo inferis
>
>|||Hi Jo,
Use IDENT_CURRENT('table_name') to get the identity value, because
IDENT_CURRENT returns the last identity value generated for a specific table
in any session and any scope.
@@IDENTITY returns the last identity value generated for any table in the
current session, across all scopes.
SCOPE_IDENTITY returns the last identity value generated for any table in
the current session and the current scope.
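To illustrate the difference, a minimal T-SQL sketch (the table and trigger setup here are hypothetical, not from the original post):

```sql
-- Suppose inserting into Orders fires a trigger that also inserts into an
-- Audit table, and both tables have identity columns.
INSERT INTO Orders (CustomerName) VALUES ('Jo')

SELECT SCOPE_IDENTITY()        -- identity created in Orders, in THIS scope
SELECT @@IDENTITY              -- identity created by the trigger's Audit insert
SELECT IDENT_CURRENT('Orders') -- last Orders identity from ANY session
```

Only SCOPE_IDENTITY() is immune to both triggers and other sessions, which is why it is normally the right choice inside a stored procedure.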
Regards
Sivakumar
"Jo Inferis" wrote:

> I have an ASP.NET application with a SQL Server 2000 backend, where two
> pages fire off two different stored procedures. Each stored procedure
> creates a new record in a particular table then uses Scope_Identity() to get
> the id of the newly created record for adding it to a link table. The
> problem I'm seeing appears to be Scope_Identity() behaving as though it
> were @@IDENTITY, i.e. on the occasion when both procedures are fired at once
> (different machines and browsers, not that it should matter), one procedure
> appears to get the id of the record created by the other and essentially
> steal its record.
> Has anyone come across anything like this before?
> thanks in advance,
> --
> jo inferis
>
>|||Subramaniam Sivakumar wrote:
> Use IDENT_CURRENT('table_name') to get the Identity value.
That's not going to help, both identities are created in the same table.
Obviously I didn't make that clear enough.
jo inferis|||Mike Epprecht (SQL MVP) wrote:
> Do you have code to reproduce this issue?
It's a little difficult to extract the code to reproduce it, and this is
only a vague possibility anyway. I was just wondering if there might have
been something I'd missed in the usage of Scope_Identity().

> I only way I could think this happens is if both SP's are executed on
> the same connection (which is unlikely).
I did think about that, but even then, the scope in each case should be
different, shouldn't it?
jo inferis|||If scope_identity() is returning the same identity as the other stored proc,
perhaps your application is using connection pooling. I'm guessing that
would explain how 2 different pages would end up being in the same "scope".|||I don't see how this can happen but maybe this will help:
Wrap all inserts in your stored procedure in a transaction.
We haven't seen any code so you may be doing this anyway.
"Jo Inferis" <jo@.inferis.NOSPAM.gotadsl.co.uk> wrote in message
news:%23LYkTseNFHA.1176@.TK2MSFTNGP12.phx.gbl...
> Mike Epprecht (SQL MVP) wrote:
> It's a little difficult to extract the code to reproduce it, and this is
> only a vague possibility anyway. I was just wondering if there might have
> been something i'd missed in the usage of Scope_Identity().
>
> I did think about that, but even then, the scope in each case should be
> different, shouldn't it?
> --
> jo inferis
>

Possible Primary Key Problem

I am struggling to get a relationship to work in my database and I cannot
get my head round the problem so I wondered if any of you knowledgeable
people on here knew the answer.
An image of my Diagram is here
(http://www.step-online.org.uk/relationship.png) showing some of the
relationships.
Everything works fine as is.
I am now trying to create a many-to-many-to-many type of relationship (if
this is even possible).
My courses tables work fine as a many-to-many relationship.
I have created a new join table so I can link individuals to courses (using
the unique ID column in the Courses join table).
All the fields ending _ID are numeric data type with an identity set to
increment by 1 each time.
I have tried to create a relationship between XREF_Courses_ID in the
SYS_Xref_Join_Ind_Courses table and the same field (both primary keys) in
the SYS_Xref_Join_Courses table, and receive the following error:
"The columns in table 'SYS_Xref_Join_Courses' do not match an existing
primary key or UNIQUE constraint".
Why would this be?
On Tue, 6 Apr 2004 12:22:53 +0100, "Keith" <@..> wrote:

>I am struggling to get a relationship to work in my database and I cannot
>get my head round the problem so I wondered if any of you knowledgeable
>people on here knew the answer.
>An image of my Diagram is here
>(http://www.step-online.org.uk/relationship.png) showing some of the
>relationships.
>Everything works fine as is.
>I am now trying to create a many-to-many-to-many type of relationship (if
>this is even possible).
>My courses tables work fine as a many-to-many relationship.
>I have created a new join table so I can link individuals to courses (using
>the unique ID column in the Courses join table).
>All the fields ending _ID are numeric data type with an identity set to
>increment by 1 each time.
Aaarrgghhhhhh...

>I have tried to create a relationship between XREF_Courses_ID in the
>SYS_Xref_Join_Ind_Courses table and the same field (both primary keys) in
>the SYS_Xref_Join_Courses table, and receive the following error:
>"The columns in table 'SYS_Xref_Join_Courses' do not match an existing
>primary key or UNIQUE constraint".
>Why would this be?
>
Because the columns in that table do indeed not match an existing
primary key. Both tables have a primary key over multiple columns, so
you can't use just one of these columns to create a relationship.
Explanation: any foreign key (which is what a relationship translates
to) needs to join on at least one side to a column or column
combination that identifies at most one row. Generally the primary
key; optionally a column combination with UNIQUE constraint (a
candidate key).
A many-to-many relationship can only be implemented through an extra
table. For the relationship between courses and individuals, something
like:
CREATE TABLE CourseSubscriptions
(IND_ID int not null,
 COURSE_ID int not null,
 PRIMARY KEY (IND_ID, COURSE_ID),
 FOREIGN KEY (IND_ID)
   REFERENCES SYS_Individual (IND_ID),
 FOREIGN KEY (COURSE_ID)
   REFERENCES SYS_Courses (COURSE_ID))
On second thoughts - it seems as if the Xref-Join-Ind-Courses (who
comes up with such names?) is already made for this purpose. Just get
rid of the superfluous XREF_Ind_Course_ID and create the one missing
foreign key constraint.
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)

Possible pass parameters on "Jump to URL" properties?

Hi,

I use the "Jump to URL" property and the code as follows:

=SWITCH(Fields!KpiCode.Value=Fields!KpiCode.Value,void(window.open('"+First(Fields!ServerString.Value,
"HostInfo")+"/L3/"+Fields!KpiCode.Value+".aspx','"+First(Fields!PropertyString.Value,
"HostInfo")+"'))")

I want to know if there is any way to pass the report
parameter through this kind of property or code?
For some requirements I can only use Jump to URL, but I hope to pass the
parameters via the "Jump to URL" property.

Thanks for any advice!
Angi

Similar to Fields, you can pass parameters.

Parameters!ParameterName.Value

|||

Brad,

Thanks for reply!

It's my mistake, I didn't describe it clearly.

This code will generate the following in the IE status bar:

=void(window.open('http://anginb:8022/L3/Bu01301200.aspx','_blank','toolbar=no,location=no,directories=no,status=no,menubar=no,scrollbar=yes,resizable=yes,copyhistory=no'))"

And what I want is to pass the report parameter through this .aspx URL like this:

=void(window.open('http://anginb:8022/L3/Bu01301200.aspx?puog=01','_blank','toolbar=no,location=no,directories=no,status=no,menubar=no,scrollbar=yes,resizable=yes,copyhistory=no'))"

puog is a report parameter!

Could I use report parameter on url as a url parameter?

Thanks for advice!

|||Sure, this is possible, look in the BOL or the online version [1], there are samples about passing URL parameters.

[1] http://msdn2.microsoft.com/en-us/library/ms153586.aspx

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de

|||

you can pass a report parameter in the URL if you are using URL Access of Reporting Services.

Basically, when using URL Access to view the report, you need to first specify the report server virtual directory. An example looks like:

http://localhost/ReportServer?

After that, you need to append the path of the report to the URL. Assume you have installed the sample reports and want to view the Employee Sales Summary report:

http://localhost/ReportServer?/SampleReports/Employee Sales Summary

And then, you need to append a report command so that the Report Server knows that you want to render the report:

http://localhost/ReportServer?/SampleReports/Employee Sales Summary&rs:command=render

Finally, you can append the value of the parameter to the URL:

http://localhost/ReportServer?/SampleReports/Employee Sales Summary&rs:command=render&EmpID=20&ReportMonth=12&ReportYear=2003
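For reference, the URL above can also be assembled programmatically. A small Python sketch (the server name, report path, and parameter values are just the examples from this post):

```python
# Sketch: assemble a Reporting Services URL-access request with parameters.
from urllib.parse import quote, urlencode

def report_url(server, report_path, **params):
    # rs:Command=Render tells the report server to render the report;
    # report parameters are appended as ordinary name=value pairs.
    query = urlencode({"rs:Command": "Render", **params})
    return f"{server}?{quote(report_path)}&{query}"

print(report_url("http://localhost/ReportServer",
                 "/SampleReports/Employee Sales Summary",
                 EmpID=20, ReportMonth=12, ReportYear=2003))
```

Note that urlencode also percent-encodes the colon in rs:Command; the report server accepts either form.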

|||

Jens, Yicong,

Thanks a lot!

Simple!

Could I use any parameter after http://localhost:8022/L3/Bu01301200.aspx, and how would I do that?

Thanks!

|||If this is an external (non-Reporting Services) page you have to grab those parameters through the QueryString collection and pass them on to the Reporting Services parameter as mentioned above.

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de
|||Thanks, Jens.|||

Can anybody please tell me how to pass multiple values in the open() method to the next page?

Actually I want to know the separator between the different parameters passed.

The code I wrote is something like this:

function addr()
{
open("edoc_add.asp?Docno=<%=DaNo%> Address=<%=Daln%> Pin=<%=DaPin%> Tel=<%=DaTel%> Mob=<%=DaMob%> Mark=<%=DaRm%> mail=<%=Damail%> url=<%=DaUrl%>","AddressInfo","toolbar=No,StatusBar=No,Resize=Yes,Scrollbars=Yes,width=600,height=400")
}

I'm calling it on the click event of an anchor tag.

Please reply soon.

Thanks in advance.

|||

It's always name=value pairs, joined with "&":

ParamName=1&ParamName=2 ...

Jens.
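In other words, parameters are not separated by spaces: each name=value pair is URL-encoded and the pairs are joined with "&". A small Python sketch (the parameter names and values are just illustrative):

```python
# Sketch: query-string parameters are name=value pairs joined with "&",
# with each value URL-encoded (spaces do NOT separate parameters).
from urllib.parse import urlencode

params = {"Docno": "D 42", "Pin": "560001", "mail": "jo@example.com"}
qs = urlencode(params)   # "Docno=D+42&Pin=560001&mail=jo%40example.com"
print("edoc_add.asp?" + qs)
```

The receiving ASP page can then read each value from the Request.QueryString collection by name.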

Possible newbie SQL question. Help.

I have the below table

Organization:
emp_id, emp_name, manager_id, level

level 3 employees report to level 2 employees than report to level 1 employees

I need to select all level 3 employees, their direct manager (level 2) and indirect manager (level1).

Any help?
Treat each level as a separate table:

select l3.emp_name, l2.emp_name, l1.emp_name
from emp l1, emp l2, emp l3
where l1.emp_id = l2.manager_id
and l2.emp_id = l3.manager_id;

Or in Oracle there is "CONNECT BY" to perform a tree-structured query.
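The same three-level self-join can be demonstrated end to end. A runnable sketch using an in-memory SQLite database and a hypothetical emp table with one chain of employees:

```python
# Runnable sketch of the three-level self-join: join the emp table to
# itself once per level of the hierarchy.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (emp_id INT, emp_name TEXT, manager_id INT, level INT)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?, ?)", [
    (1, "Ann", None, 1),   # level 1: top manager
    (2, "Bob", 1,    2),   # level 2: reports to Ann
    (3, "Cid", 2,    3),   # level 3: reports to Bob
])
rows = conn.execute("""
    SELECT l3.emp_name, l2.emp_name, l1.emp_name
    FROM emp l1, emp l2, emp l3
    WHERE l1.emp_id = l2.manager_id
      AND l2.emp_id = l3.manager_id
""").fetchall()
print(rows)  # [('Cid', 'Bob', 'Ann')]: employee, direct manager, indirect manager
```

This works for a fixed number of levels; for arbitrary depth you would need a hierarchical construct such as Oracle's CONNECT BY mentioned above.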

Possible Network Error: Write to SQL Server Failed

Any idea on what can cause this error message? "Possible
Network Error: Write to SQL Server Failed"
Is it due to a network disconnect? We are running the
application thru Metaframe XP to SQL Server 2000 with SP 3.
Thanks
Hi
Googling for "Write to SQL Server Failed" turns up lots of hits including:
http://support.microsoft.com/default...b;en-us;109787
http://support.microsoft.com/?id=259775
http://support.microsoft.com/?id=199105
http://support.microsoft.com/?id=315938
It would be useful to know the protocols being used.
John
"mikef" <anonymous@.discussions.microsoft.com> wrote in message
news:1d41c01c453b4$0fcd2da0$a501280a@.phx.gbl...
> Any idea on what can cause this error message? "Possible
> Network Error: Write to SQL Server Failed"
> Is it due to a network disconnect? We are running the
> application thru Metaframe XP to SQL Server 2000 with SP 3.
> Thanks

Possible memory leak definite error when using CONVERT statement

Hi all,
I have a problem with my SQL Server 2000 installation that I can only think
might be a memory leak.
I was wondering if any of you have experienced similar problems and if so
how you overcame them?
Within 10 days of the server running we are having to restart the server to
flush a variable test out of memory.
The error appears when running SQL from Query Analyser, Stored procedures
called from aspx pages etc.
The error looks something like
"Unable to convert variable test to an integer variable"
As I'm sure you can guess this error only occurs when using CONVERT but rest
assured there is no variable test declared by any of the scripts we are
running on the server.
BRB
That's strange
Can you show us this script?
"BPBForum" <BPBForum@.discussions.microsoft.com> wrote in message
news:CE1DC59F-5035-429D-887A-0DAA55C0E102@.microsoft.com...
> Hi all,
> I have a problem with my SQL Server 2000 installation that I can only think
> might be a memory leak.
> I was wondering if any of you have experienced similar problems and if so
> how you overcame them?
> Within 10 days of the server running we are having to restart the server to
> flush a variable test out of memory.
> The error appears when running SQL from Query Analyser, Stored procedures
> called from aspx pages etc.
> The error looks something like
> "Unable to convert variable test to an integer variable"
> As I'm sure you can guess this error only occurs when using CONVERT but rest
> assured there is no variable test declared by any of the scripts we are
> running on the server.
|||Hi
What SQL version and SP level? And we need to see the code where this occurs.
Don't forget, SQL Server does a lot of implicit conversions so explicit use
of CONVERT is not required.
Regards
Mike
"BPBForum" wrote:

> Hi all,
> I have a problem with my SQL Server 2000 installation that I can only think
> might be a memory leak.
> I was wondering if any of you have experienced similar problems and if so
> how you overcame them?
> Within 10 days of the server running we are having to restart the server to
> flush a variable test out of memory.
> The error appears when running SQL from Query Analyser, Stored procedures
> called from aspx pages etc.
> The error looks something like
> "Unable to convert variable test to an integer variable"
> As I'm sure you can guess this error only occurs when using CONVERT but rest
> assured there is no variable test declared by any of the scripts we are
> running on the server.
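As a side note on Mike's point about implicit conversions, a minimal T-SQL sketch (the variable name is hypothetical):

```sql
DECLARE @s varchar(10)
SET @s = '42'
SELECT CONVERT(int, @s)   -- explicit conversion
SELECT @s + 0             -- implicit conversion: varchar is promoted to int
```

Both statements succeed when the string holds a valid integer; either form would raise the same conversion error if it did not.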

Possible IO Problems

Hi guys

We have a 23GB SQL database and we are experiencing some pretty dire performance problems.

I think this may be related to the disk setup.

We have 3 physical SCSI disks on 1 array which make the C and D drive. (RAID 5)

We also have 3 SCSI disks (also RAID 5) which make the E drive; this E drive contains all the database files and all the transaction log files.

On a daily basis we are adding approximately 83,000 records.

Database activity is nearly always high as queries are running to generate results and the same server is also serving browser requests via IIS to display the data for users.

Now I have suggested that we split the database files and the transaction log files onto 2 separate physical disks.

In this server we will also have 4 70GB SCSI disks, these will be split into 2 arrays using RAID 1. Giving me 2 logical drives of 70GB each. I proposed putting the database files on one and the transaction log files on the other.

We maintain our redundancy as both disks are mirrored and we also have offsite backups.

Will we notice improvement by removing the database files and the transaction log files off the same RAID 5 array and onto separate RAID 1 arrays (what are people's opinions regarding SQL database files on RAID 5?)

Many thanks for your help guys

Hanley|||Originally posted by Hanley
Hi guys

We have a 23GB SQL database and we are experiencing some pretty dire performance problems.

I think this may be related to the disk setup.

We have 3 physical SCSI disks on 1 array which make the C and D drive. (RAID 5)

We also have 3 SCSI disks (also RAID 5) which make the E drive; this E drive contains all the database files and all the transaction log files.

On a daily basis we are adding approximately 83,000 records.

Database activity is nearly always high as queries are running to generate results and the same server is also serving browser requests via IIS to display the data for users.

Now I have suggested that we split the database files and the transaction log files onto 2 separate physical disks.

In this server we will also have 4 70GB SCSI disks, these will be split into 2 arrays using RAID 1. Giving me 2 logical drives of 70GB each. I proposed putting the database files on one and the transaction log files on the other.

We maintain our redundancy as both disks are mirrored and we also have offsite backups.

Will we notice improvement by removing the database files and the transaction log files off the same RAID 5 array and onto separate RAID 1 arrays (what are people's opinions regarding SQL database files on RAID 5?)

Many thanks for your help guys

Hanley

It's been a while since I've had to beat on this one. So bear with me while I throw some questions at you.

What is the general setup of the server? CPU(s)? RAM?
How many SCSI controllers do you have?
How much memory do they have?
How fast are the SCSI drives?
Can you do replication of the database to another server for read only IIS?

RAID 5 is typically faster than RAID 1 in the real world. And if your choke point is the controller, throwing all the drives in the world at it isn't going to help.

In addition even splitting the disks for transaction logs and the datafiles is not going to give you much more speed if they are going against the same controller. As a matter of fact it may even slow you down.

If your C/D are on a separate controller from the E, then run your db files to c/d and your logs to e.

Just how I see it....|||on a 3-disk raid-5 there are at least 3 physical io's, vs. raid-1 yields only 2. raid-1 is actually faster, and for a 3-disk raid-5 it provides a better level of reliability than raid-5.|||It also sounds like your OS is on a RAID 5. If the OS is a RAID 5, then the swap file is probably there as well, so the system has to calculate parity for the swap file space every time it writes to virtual memory. This can hammer your performance really, really badly.
I would recommend installing a separate partition or disk, separate from the RAID arrays (at least any with parity), and point the virtual memory swap file to live there.|||Originally posted by ms_sql_dba
on a 3-disk raid-5 there are at least 3 physical io's, vs. raid-1 yields only 2. raid-1 is actually faster, and for a 3-disk raid-5 it provides a better level of reliability than raid-5.

Yeah, RAID 1 is definitely faster, and more space efficient. But it doesn't provide you with any failsafe. If you lose one of the drives all your data is gone. The point of RAID 5 is to give you redundancy without too much extra space overhead and access times. If you lose a RAID 5 drive, your data is still intact.|||Originally posted by ms_sql_dba
on a 3-disk raid-5 there are at least 3 physical io's, vs. raid-1 yields only 2. raid-1 is actually faster, and for a 3-disk raid-5 it provides a better level of reliability than raid-5.

It depends on the SCSI controller and drives being optimized. If so, the three physical I/O's should be running concurrently (write/3 plus parity) vs. RAID 1 being parallel.

I grant speed diff of RAID 5 on 3 disk is questionable depending on HW, Firmware (FW), and SW vs RAID 1. Running RAID 5 on 5+ disks will generally show the difference.

Another point in RAID 5's favor, depending on OS/FW, is continued expandability. I think the high end SCSI size is about 320GB and you are talking big $ per. In RAID 1 you are limited to that, unless you do RAID 10, but then you might as well look at SAN/NAS. With a good RAID 5 you can add drives to the limit of the controllers.

But if he is trying to run those databases and transactions on a 2x500Mz with 2G RAM a FastSCSI 1 controller and a 10/100NIC, he might as well be trying to pump out of the Pacific using a hand bailing pump and a straw.|||Originally posted by craigmc
Yeah, RAID 1 is definitely faster, and more space efficient. But it doesn't provide you with any failsafe. If you lose one of the drives all your data is gone. The point of RAID 5 is to give you redundancy without too much extra space overhead and access times. If you lose a RAID 5 drive, your data is still intact.

RAID 0 = Striping
RAID 1 = Mirroring
RAID 5 = Striping w/ parity
RAID 10 = Striping w or w/o parity and mirroring (generally with parity)|||craig, in raid-1 if I lose one of the drives, the other will have an exact copy of the data up to the point of failure of the first drive.

also, it has become a de facto standard for small to medium size businesses that do not utilize SAN technology to put the data on raid-5 with 5+ drives spread through 2+ scsi controllers, logs on raid-1, also multiple controllers to allow for semi-parallel io. I also recommend putting tempdb on a separate controller, maybe even 1 or 2 non-raided physical disks. AND I applaud your suggestion on the location of the swap file.|||and watch paging (total pages), cpu (total pct) and logical disk (avg disk queue length).

This will give you a good first approximation of whether your problem is not enough memory (too much paging), not enough processor (cpu utilization) or insufficient disk performance.

Microsoft recommends that paging be kept below 20 pages/sec.
I suggest if the CPU(s) are spending a lot of time above 80%, you might need faster or more processors.
But I suspect your problem is disk performance. The thing that REALLY leaps out at me is you have both the transaction logs and data on the same drive. That's nearly always a bad idea, unless you are just too constrained by money to have it any other way. Probably nothing will speed up your response time more - particularly if you're doing a lot of data changes as opposed to mostly SELECTs - than separating the log from the data files.

No doubt, RAID 1 is faster than RAID 5. RAID 10, if you can afford it, is even better. Sometimes it's hard to convince people of this, but RAID 5 is SLOW to write data. So if you're doing a lot of INSERTS, UPDATES or DELETES, get off of a RAID 5 array, it's probably killing you.
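A rough back-of-the-envelope model of the trade-off being argued here (usable capacity vs. physical I/Os per small write); a sketch of the textbook numbers only, ignoring controller caches and stripe geometry, which change the picture considerably:

```python
# Simplified textbook model of RAID capacity and small-write cost.
# Idealized numbers, not benchmarks.

def usable_disks(level: str, n: int) -> float:
    """Disks' worth of usable capacity for an n-disk array."""
    if level == "raid0":
        return n          # no redundancy at all
    if level == "raid1":
        return n / 2      # classically n == 2
    if level == "raid5":
        return n - 1      # one disk's worth lost to parity
    if level == "raid10":
        return n / 2      # striped mirrors
    raise ValueError(level)

def ios_per_small_write(level: str) -> int:
    """Physical disk operations per logical small (partial-stripe) write."""
    return {
        "raid0": 1,
        "raid1": 2,   # write both mirrors
        "raid5": 4,   # read old data + old parity, write new data + new parity
        "raid10": 2,  # write both halves of one mirror pair
    }[level]

if __name__ == "__main__":
    for lvl, n in [("raid1", 2), ("raid5", 5), ("raid10", 6)]:
        print(lvl, usable_disks(lvl, n), ios_per_small_write(lvl))
```

The factor-of-4 small-write cost is where the "RAID 5 is SLOW to write" claim comes from.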

ALL THAT SAID, keep in mind that throwing hardware at a performance problem is the last avenue you should pursue. If you have performance issues, first TUNE YOUR SQL. One frequent but inefficient query can cost a lot of machine performance. Next, TUNE YOUR DATABASE. Do you have the proper indexes to make your queries execute quickly? Finally, TUNE THE ENGINE. SQL 2000 does a good job of this by itself, but there may be parms you can change to improve it even further. These three steps, particularly the first two, at least have the potential to give you far greater performance gains than any amount of hardware. Only after you've convinced yourself that the common queries are sensibly formulated and have the proper indexes should you even consider spending money on faster hardware. Better hardware probably won't gain you more than 2x to 10x your current speed, but tuning can literally get you gains of 100x - 10000x.
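The "proper indexes" point is easy to check empirically. A small sketch using SQLite from Python as a stand-in engine (table and index names are made up); the same scan-versus-seek inspection is what a plan viewer shows you on any engine:

```python
# The same query's plan goes from a full table scan to an index seek
# once a suitable index exists. SQLite here is just a convenient
# stand-in; the principle is engine-agnostic.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, float(i)) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = 42"

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the human-readable detail in column 3
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

before = plan(query)   # e.g. "SCAN orders" -- full table scan
con.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")
after = plan(query)    # e.g. "SEARCH orders USING INDEX ix_orders_customer ..."

print(before)
print(after)
```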

If necessary, use the Profiler to analyze incoming queries. Take the frequent ones, paste them into Query Analyzer and take a look at the query plans and I/O costs. Build (or add columns to) indexes to eliminate as many of the table and index scans as you can and get the engine doing index seeks. Time and time again I've seen a little work here yield eye-popping gains.|||Originally posted by ms_sql_dba
craig, in raid-1 if I lose one of the drives, the other will have an exact copy of the data up to the point of failure of the first drive.


Ahh yes, I got my 0 and 1 confused. Sorry for adding my confusion to the discussion.|||Originally posted by craigmc
Ahh yes, I got my 0 and 1 confused. Sorry for adding my confusion to the discussion.

No problem. You are right about the swap file.

Until we know a little more about the server he's running on we're just shooting in the dark as to disk access being the problem.

And tuning can help, but some of us are really maintaining databases that are for applications delivered by vendors. We had one application delivered by a vendor that the EOD process had climbed north of 6 hours over time as the data increased. After complaining enough they finally delivered the next version which dropped it to 30 minutes with the same amount of data. They didn't change the database at all, they tuned their queries, which we couldn't touch. Have no doubt, some vendors can't program to find the hole in their backside with a data map, specific queries, and the light of open source code.

Paraphrasing Hanley "A 23GB SQL database and adding approximately 83,000 records daily." With probably mass selects for the IIS.

My company had fewer transactions and sub-10GB DBs, which forced us off a dual 550 with 1GB RAM onto a hyperthreaded dual 2.2 Xeon with 2GB RAM and a connection to a SAN. Giant leaps.|||Guys

Sorry its taken so long to reply, I've been on a course the last 2 days and haven't been in the office all week.

I will check the spec of the server tomorrow but off the top of my head it is as follows:

4 x 1.5Ghz CPU
4GB physical memory (SQL limited to 2GB as only running Std edition)

Will have to check the details of the RAID controller

Can I just say that CPU resource on the server is never really above 20-30%, so I don't think this is the bottleneck

Will check the specs of hard disk etc tomorrow

Cheers

Hanley|||Originally posted by Hanley
I will check the spec of the server tomorrow but off the top of my head it is as follows:

4 x 1.5Ghz CPU
4GB physical memory (SQL limited to 2GB as only running Std edition)

Will have to check the details of the RAID controller


That isn't too bad - details on the controller would be good. What's your backbone like? Hubs? Switches? Fiber, 10, 10/100?

Just pointing out other things to look at.


Possible in ActiveReports but not in RS?

What I was talking about is storing the "individual" pages of the
report's intermediate format into a database. Those pages can be
selected, tagged and exported not just to PDF but to any of the other
formats that AR supports. Bryan's response made it appear as if I was
talking about PDF pages which is not the case. Storing the individual
PDF pages is useful in some cases but not as powerful as the scenario
that I mentioned. The scenario I mention can be used to export PDF
pages, html or send faxes in Tiff Formats. MSRS stores the report
intermediate format into a database also, however, the whole report is
stored and cannot be exported as individual pages.
To: James L.: The idea is that you generate a large report once then
retrieve the pages as needed, this can be used where the original data
might have been archived and is no longer available. In case of
statements or old student records and such.
Bryan Keller [MSFT] wrote:
> Reporting Services can easily be used to render single pages of a
report to
> individual PDF files and then those files can be stored in a database
with
> attribution. It is a function of Acrobat and not of the report writer
to
> then merge the PDF files into a single document. With Reporting
Services and
> some custom code, you can meet that scenario hands down. That
functionality
> can probably be accomplished with a number of reporting tools,
Reporting
> Services absolutely being one of them. Bottom line, Reporting
Services
> can be made to work in that scenario.
>
> --
> Bryan Keller
> Developer Documentation
> SQL Server Reporting Services
>
> A friendly reminder that this posting is provided "AS IS" with no
> warranties, and confers no rights.
>
>
> "Adnan Masood" <adnan.masood@.nextestate.com> wrote in message
> news:urS72IyGEHA.1240@.TK2MSFTNGP10.phx.gbl...
> >
> > Issam Elbaytam, Product Manager Active Reports posted a brief
comparative
> > analysis among Active Reports & MS Reporting Services in his blog
here.
> > http://blogs.pingpoet.com/thoughtchain/archive/2004/04/02/504.aspx
> >
> > Defining the following scenario, he claims this can't be
implemented in
> any
> > other reporting solution.
> >
> > "We have a customer that wanted to print a set of monthly
statements
> (>1000
> > pages) and needed to export these statements (individually) to PDF
based
> > on the users request. He ended up generating the report output
then
> taking
> > the individual pages (using our RDF file format) and inserting each
page
> as
> > a record in a database with some keywords. Now his users can use
his app
> > to pick and choose which of those pages they need. The app would
then
> grab
> > the selected pages from the database, merge them into a new RDF
document
> and
> > export the document to PDF (or any other format) as if it was run
that
> > instant. You simply cannot do that in any other report writer"
> >
> > I'm pretty sure RS has capabilities to implement but would like to
know the
> > opinions of others in the group.
> >
> > Thanks.
> >
> > Adnan Masood
> >
> >Reporting Services already does this but it is not based on pages since
pagination is rendering specific. We do "chunk" our intermediate format as
it is written to the database. We are looking at making this chunking more
granular to improve stream efficiency in future releases.
--
Brian Welcker
Group Program Manager
Microsoft SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
<issamelbaytam@.hotmail.com> wrote in message
news:cbaksb$dhf@.odah37.prod.google.com...
> What I was talking about is storing the "individual" pages of the
> report's intermediate format into a database. Those pages can be
> selected, tagged and exported not just to PDF by to any of the other
> formats that AR supports. Bryan's response made it appear as if I was
> talking about PDF pages which is not the case. Storing the individual
> PDF pages is useful in some cases but not as powerful as the scenario
> that I mentioned. The scenario I mention can be used to export PDF
> pages, html or send faxes in Tiff Formats. MSRS stores the report
> intermediate format into a database also, however, the whole report is
> stored and cannot be exported as individual pages.
> To: James L.: The idea is that you generate a large report once then
> retrieve the pages as needed, this can be used where the original data
> might have been archived and is no longer available. In case of
> statements or old student records and such.
>
> Bryan Keller [MSFT] wrote:
>> Reporting Services can easily be used to render single pages of a
> report to
>> individual PDF files and then those files can be stored in a database
> with
>> attribution. It is a function of Acrobat and not of the report writer
> to
>> then merge the PDF files into a single document. With Reporting
> Services and
>> some custom code, you can meet that scenario hands down. That
> functionality
>> can probably be accomplished with a number of reporting tools,
> Reporting
>> Services absolutely being one of the them. Bottom line, Reporting
> Services
>> can be made to work in that scenario.
>> --
>> Bryan Keller
>> Developer Documentation
>> SQL Server Reporting Services
>> A friendly reminder that this posting is provided "AS IS" with no
>> warranties, and confers no rights.
>>
>> "Adnan Masood" <adnan.masood@.nextestate.com> wrote in message
>> news:urS72IyGEHA.1240@.TK2MSFTNGP10.phx.gbl...
>> >
>> > Issam Elbaytam, Product Manager Active Reports posted a brief
> comparative
>> > analysis among Active Reports & MS Reporting Services in his blog
> here.
>> > http://blogs.pingpoet.com/thoughtchain/archive/2004/04/02/504.aspx
>> >
>> > Defining the following scenario, he claims this can't be
> implemented in
>> any
>> > other reporting solution.
>> >
>> > "We have a customer that wanted to print a set of monthly
> statements
>> (>1000
>> > pages) and needed to export these statements (individually) to PDF
> based
>> > on the users request. He ended up generating the report output
> then
>> taking
>> > the individual pages (using our RDF file format) and inserting each
> page
>> as
>> > a record in a database with some keywords. Now his users can use
> his app
>> > to pick and choose which of those pages they need. The app would
> then
>> grab
>> > the selected pages from the database, merge them into a new RDF
> document
>> and
>> > export the document to PDF (or any other format) as if it was run
> that
>> > instant. You simply cannot do that in any other report writer"
>> >
>> > I'm pretty sure RS has capabilites to implement but would like to
> know the
>> > opinions of others in the group.
>> >
>> > Thanks.
>> >
>> > Adnan Masood
>> >
>> >
>

Possible help with query...

Ok, I have a parent table that has a child table; it is a 1-many relationship. I want to select the single newest child from the child table, with the parent table information, for each parent value in the table (so if there is a parent record, I only want the newest child record). I have it working, but I am not satisfied with how I have written it. I wanted to see if there were any other opinions on the matter. The way I have it written there *could* be duplicates if somehow the date/time stamps are the same on the Created column


here is an example:

DECLARE @ParentTable TABLE ( ParentId UNIQUEIDENTIFIER , Title NVARCHAR(20) )
DECLARE @ChildTable TABLE ( ChildId UNIQUEIDENTIFIER , ParentId UNIQUEIDENTIFIER , Title NVARCHAR(20) , Created DATETIME )

INSERT INTO @ParentTable ( ParentId , Title ) VALUES ('00000000-0000-0000-0000-000000000001' , 'Option A' )
INSERT INTO @ParentTable ( ParentId , Title ) VALUES ('00000000-0000-0000-0000-000000000002' , 'Option B' )
INSERT INTO @ParentTable ( ParentId , Title ) VALUES ('00000000-0000-0000-0000-000000000003' , 'Option C' )

INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('10000000-0000-0000-0000-000000000001' , '00000000-0000-0000-0000-000000000001' , 'Child A - 1' , '5/4/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('10000000-0000-0000-0000-000000000002' , '00000000-0000-0000-0000-000000000001' , 'Child A - 2' , '5/3/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('10000000-0000-0000-0000-000000000003' , '00000000-0000-0000-0000-000000000001' , 'Child A - 3' , '5/2/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('10000000-0000-0000-0000-000000000004' , '00000000-0000-0000-0000-000000000001' , 'Child A - 4' , '5/1/2007 4:00:00 PM' )

INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('20000000-0000-0000-0000-000000000001' , '00000000-0000-0000-0000-000000000002' , 'Child B - 1' , '5/1/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('20000000-0000-0000-0000-000000000002' , '00000000-0000-0000-0000-000000000002' , 'Child B - 2' , '5/10/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('20000000-0000-0000-0000-000000000003' , '00000000-0000-0000-0000-000000000002' , 'Child B - 3' , '5/2/2007 4:00:00 PM' )

INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('30000000-0000-0000-0000-000000000001' , '00000000-0000-0000-0000-000000000003' , 'Child C - 1' , '5/1/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('30000000-0000-0000-0000-000000000002' , '00000000-0000-0000-0000-000000000003' , 'Child C - 2' , '5/2/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('30000000-0000-0000-0000-000000000003' , '00000000-0000-0000-0000-000000000003' , 'Child C - 3' , '5/3/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('30000000-0000-0000-0000-000000000004' , '00000000-0000-0000-0000-000000000003' , 'Child C - 4' , '5/4/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('30000000-0000-0000-0000-000000000005' , '00000000-0000-0000-0000-000000000003' , 'Child C - 5' , '5/5/2007 4:00:00 PM' )
INSERT INTO @ChildTable ( ChildId , ParentId , Title , Created ) VALUES ('30000000-0000-0000-0000-000000000006' , '00000000-0000-0000-0000-000000000003' , 'Child C - 6' , '5/6/2007 4:00:00 PM' )

/**
I want to select the information from the child table and the parent table,
selecting ONLY the newest record. There should only be 3 results.
**/
SELECT
    c.ParentId ,
    p.Title AS ParentTitle ,
    c.ChildId ,
    c.Title AS ChildTitle ,
    c.Created
FROM
    @ChildTable c
    INNER JOIN @ParentTable p ON c.ParentId = p.ParentId
    INNER JOIN
    (
        SELECT
            subc.ParentId ,
            MAX(subc.Created) AS Created
        FROM
            @ChildTable subc
            INNER JOIN @ParentTable subpt ON subc.ParentId = subpt.ParentId
        GROUP BY
            subc.ParentId
    ) AS SubPT ON p.ParentId = SubPT.ParentId AND SubPT.Created = c.Created

here are the expected results:

ParentId ParentTitle ChildId ChildTitle Created
----------- ------- ----------- ------- --------
00000000-0000-0000-0000-000000000001 Option A 10000000-0000-0000-0000-000000000001 Child A - 1 2007-05-04 16:00:00.000
00000000-0000-0000-0000-000000000002 Option B 20000000-0000-0000-0000-000000000002 Child B - 2 2007-05-10 16:00:00.000
00000000-0000-0000-0000-000000000003 Option C 30000000-0000-0000-0000-000000000006 Child C - 6 2007-05-06 16:00:00.000

Anyone have any ideas?

I suggest that you change ParentId and ChildId to be identity integers (add a UNIQUEIDENTIFIER if need be). The child table can now be sorted in time sequence by sorting on Id.|||I added an "Alt" id to it using bigint, but I wanted to keep with GUIDs for several design considerations. I could change it and might look into it. I know of "reverse grouping" using the primary key when it is an integer type (using the same premise of MAX(PrimaryKey) and grouping by the parent object).|||
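A quick sketch of the identity-integer idea above, using SQLite from Python as a convenient test engine (the schema here is illustrative, not the poster's): with an ever-increasing ChildId, the newest child per parent is simply MAX(ChildId), with no date tie-breaking needed.

```python
# Newest-child-per-parent via MAX(id), relying on a monotonic identity.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Parent (ParentId INTEGER PRIMARY KEY, Title TEXT);
CREATE TABLE Child  (ChildId INTEGER PRIMARY KEY AUTOINCREMENT,
                     ParentId INT, Title TEXT);
INSERT INTO Parent VALUES (1, 'Option A'), (2, 'Option B');
-- insertion order stands in for creation order
INSERT INTO Child (ParentId, Title) VALUES
  (1, 'A-1'), (1, 'A-2'), (2, 'B-1'), (1, 'A-3'), (2, 'B-2');
""")

rows = con.execute("""
    SELECT p.Title, c.Title
    FROM Child c
    JOIN Parent p ON p.ParentId = c.ParentId
    WHERE c.ChildId = (SELECT MAX(c2.ChildId)
                       FROM Child c2
                       WHERE c2.ParentId = c.ParentId)
    ORDER BY p.ParentId
""").fetchall()
print(rows)   # one row per parent: the last-inserted child
```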

double post ;-(

|||
SELECT *
FROM (
    SELECT c.ParentId, p.Title AS ParentTitle, c.ChildId, c.Title AS ChildTitle, c.Created,
           ROW_NUMBER() OVER (PARTITION BY c.ParentID ORDER BY Created DESC, ChildID DESC) AS t
    FROM @ChildTable c
    INNER JOIN @ParentTable p ON c.ParentId = p.ParentId
) t1
WHERE t = 1
|||

So, I went to try and do this as a learning experiment... I am no SQL guru or anything, but I thought I was OK. This exercise showed me I have lots to learn :-)

This is what I came up with, it is probably worse than yours. It is actually more code, and seems a lot worse. But I think I had some creative ideas at least :-)
The following works using 2 extra temp tables ( This is what you are doing in your sub select inner joins, I tried to find a cleaner way, in the end I think your way beat mine)

-- The workItems table actually has the three rows we want, selected from the ordered child table. But I did not know how to join with the groups and subqueries. So again... I think yours beat mine by a lot, but I am curious what other sorts of results people might come up with.

DECLARE @orderdChildTable TABLE ( ChildId UNIQUEIDENTIFIER , ParentId UNIQUEIDENTIFIER , Title NVARCHAR(20) , Created DATETIME )
INSERT INTO @orderdChildTable
SELECT c.ChildId, c.ParentID, c.Title, c.Created
FROM @ChildTable c
ORDER BY c.Created DESC

DECLARE @workItems TABLE ( ParentId UNIQUEIDENTIFIER , Created DATETIME )
INSERT INTO @workItems
SELECT oct.ParentID, MAX(oct.Created)
FROM @orderdChildTable oct
GROUP BY oct.ParentID

SELECT oct.ParentId, p.Title AS ParentTitle, oct.ChildId, oct.Title AS ChildTitle, oct.Created
FROM @workItems wi
INNER JOIN @orderdChildTable oct ON (wi.ParentID = oct.ParentID) AND (wi.Created = oct.Created)
LEFT JOIN @ParentTable p ON p.ParentID = wi.ParentID
ORDER BY ParentID ASC

|||

Motley, your query certainly looks cleaner than either of ours. (But you do return an extra recordset... which is not really important nor does it bother me).

Can you perhaps elaborate a little, breaking down the query, as to which pieces are doing what? This looks like SQL 2005-specific stuff that I am not familiar with, and perhaps others could use a small breakdown, which would help guide me as to what to read/study up on in my books and Google.

I think the piece that I am confused by/not familiar with is how to read this statement to understand what exactly it is doing:

row_number() over (PARTITION BY c.ParentID ORDER BY Created DESC, ChildID DESC) AS t

I am also a little thrown by the double From...

SELECT *
FROM ( .... () AS t FROM @ChildTable c
INNER JOIN @ParentTable p ON c.ParentId = p.ParentId
) t1
WHERE t = 1

I'm not sure what t1 is either, and what it is supposedly emitting.

|||

Motley:

SELECT *
FROM (
SELECT c.ParentId, p.Title AS ParentTitle, c.ChildId, c.Title AS ChildTitle, c.Created,
       ROW_NUMBER() OVER (PARTITION BY c.ParentID ORDER BY Created DESC, ChildID DESC) AS t
FROM @ChildTable c
INNER JOIN @ParentTable p ON c.ParentId = p.ParentId
) t1
WHERE t=1

Ah, there it is. Awesome. I ran into this problem before 2005 and didn't know how to get around it. Thanks. I have used the row_number function before; I don't know why I didn't think of it here|||

The query doesn't actually return another recordset, it's a subquery.

Start with:

SELECT c.ParentId, p.Title AS ParentTitle, c.ChildId, c.Title AS ChildTitle, c.Created,
       ROW_NUMBER() OVER (PARTITION BY c.ParentID ORDER BY Created DESC, ChildID DESC) AS t
FROM @ChildTable c
INNER JOIN @ParentTable p ON c.ParentId = p.ParentId

the row_number portion of that is telling SQL Server to create a column in the result named "t", and for each row within a "c.ParentID" partition, ordered by Created and then by ChildID, it should represent its place/rank (it's similar to RANK, but not quite). So the highest Created value for each ParentID will have a value of 1, the second highest 2, etc., but when there is a tie between two records with the same ParentID and Created, it will consider the one with the highest ChildID to be first, basically breaking the tie.

The outer query... SELECT * FROM ( {inner query here} ) t1 WHERE t=1 just says to only take the records where t=1.

t1 is what I am calling the subquery. You always have to give subqueries that are part of the FROM clause an alias.
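For anyone who wants to run the pattern outside SQL Server: here is the same PARTITION BY / ORDER BY / t = 1 logic reproduced in Python's sqlite3 module. Assumptions: SQLite 3.25+ for window functions, a trimmed-down version of the thread's data, and dates stored as ISO-8601 strings so text ordering matches chronological ordering.

```python
# Greatest-per-group via ROW_NUMBER(), mirroring Motley's query.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Parent (ParentId INT, Title TEXT);
CREATE TABLE Child  (ChildId INT, ParentId INT, Title TEXT, Created TEXT);
INSERT INTO Parent VALUES (1,'Option A'), (2,'Option B'), (3,'Option C');
-- ISO-8601 strings: lexicographic order == chronological order
INSERT INTO Child VALUES
  (1, 1, 'Child A - 1', '2007-05-04 16:00:00'),
  (2, 1, 'Child A - 2', '2007-05-03 16:00:00'),
  (3, 2, 'Child B - 2', '2007-05-10 16:00:00'),
  (4, 2, 'Child B - 3', '2007-05-02 16:00:00'),
  (5, 3, 'Child C - 5', '2007-05-05 16:00:00'),
  (6, 3, 'Child C - 6', '2007-05-06 16:00:00');
""")

rows = con.execute("""
    SELECT ParentTitle, ChildTitle FROM (
        SELECT p.Title AS ParentTitle, c.Title AS ChildTitle,
               ROW_NUMBER() OVER (PARTITION BY c.ParentId
                                  ORDER BY c.Created DESC, c.ChildId DESC) AS t
        FROM Child c
        JOIN Parent p ON c.ParentId = p.ParentId
    ) ranked
    WHERE t = 1
    ORDER BY ParentTitle
""").fetchall()
print(rows)   # exactly one (newest) child per parent
```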

|||

jminond:

Motley, your query certainly looks cleaner than either of ours. (But you do return an extra recordset... which is not really important nor does it bother me).

Can you perhaps elaborate a little, breaking down the query, as to what pieces is doing what. This looks like sql2005 specific stuff, that I am not familiar with. and perhaps others could use a small break down, which helps guide me as to what to read/study up on in my books and google.

I think the piece that I am confused/not familar with is how do I read this statment to understand what exactly it is doing:

row_number() over (PARTITION BY c.ParentID ORDER BY Created DESC, ChildID DESC) AS t

I am also a little thrown by the double From...

SELECT *
FROM ( .... () AS t FROM @ChildTable c
INNER JOIN @ParentTable p ON c.ParentId = p.ParentId
) t1
WHERE t = 1

I'm not sure what t1 is either, and what it is supposedly emitting.

The two froms come from the fact you are selecting from the inner result set.

the partition is basically a "grouping" in the rownumber function, it tells the result set to number the results based on the parentid field, thus autonumbering the children 1-X based on the parentid.

t1 refers to the "temporary result set" from the select statement in the ( ), you could also have done "AS t1" to be more direct about what you are doing

and the where t=1 tells the query you only want the results where your temporary identity is 1, thus the newest child row for each parent row

if any of that makes sense

Possible Hack Attempt - Guidance Needed

Dear All,
I run a trace that checks for "Audit Login Fail". To be honest I set it up
out of Interest, never expecting anything to come of it. Yesterday I was on
holiday and got in today to find that I have had some audit failures.
They go something like this:-
Login failed for user 'sa'
Login failed for user 'admin'
Login failed for user 'probe'
Login failed for user 'sql'
Login failed for user 'Nessus-Test-User'
There are clusters of these which are done within a second.
Therefore I have reported an attempted hack. Can anyone tell me if I have
jumped the gun here, and if not what can I do to trace the hacker.
Thanks
Peter

Did you also check to see if any security scanning tools
were run on the server? Something like MBSA?
Generally there is an entry in the Windows event log when
running security scanning tools.
You can use a network sniffer to track login attempts.
You can find more information on securing SQL Server at the
SQL Server security center site:
[url]http://www.microsoft.com/sql/techinfo/administration/2000/security/default.asp[/url]
-Sue
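The "clusters within a second" signal Peter describes can also be flagged with a few lines of scripting. A toy sketch; the (timestamp, user) rows below are hypothetical, standing in for whatever the trace table or event log actually contains:

```python
# Toy heuristic for spotting a scripted scan: many failed logins
# landing in the same second. Real input would come from the trace
# table or the Windows event log.
from collections import Counter

failures = [
    ("2005-04-11 03:02:11", "sa"),
    ("2005-04-11 03:02:11", "admin"),
    ("2005-04-11 03:02:11", "probe"),
    ("2005-04-11 03:02:11", "sql"),
    ("2005-04-11 03:02:12", "Nessus-Test-User"),
    ("2005-04-11 09:15:40", "jsmith"),   # an ordinary mistyped password
]

def bursts(events, threshold=3):
    """Return the seconds in which at least `threshold` failures landed."""
    per_second = Counter(ts for ts, _user in events)
    return sorted(sec for sec, n in per_second.items() if n >= threshold)

suspicious = bursts(failures)
print(suspicious)   # seconds that look like a scripted scan
```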
On Tue, 12 Apr 2005 03:13:02 -0700, "Peter Nolan"
<PeterNolan@.discussions.microsoft.com> wrote:

>Dear All,
>I run a trace what checks for "Audit Login Fail". To be honest I set it up
>out of Interest, never expecting anything to come of it. Yesterday I was on
>holiday and got in today to find that I have had some audit failures.
>They go something like this:-
>Login failed for user 'sa'
>Login failed for user 'admin'
>Login failied for user 'probe'
>Login failed for user 'sql'
>Login failed for user 'Nessus-Test-User'
>There are clusters of these which are done within a second.
>Therefore I have reported an attempted hack. Can anyone tell me if I have
>jumped the gun here, and if not what can I do to trace the hacker.
>Thanks
>Peter|||Thanks for your input Sue.
Unfortunately I do not have any server access, and the person I needed to
speak to (Head of IT Security) is on holiday.
I have done some research and found that it was indeed a hack attempt (have
a look at http://www.nessus.org); however, they could not actually get into
the database as the sa password is too tight. Thank you for the link to SQL
Security; I have been through that before and have already put in the
requirements.
The thought is that our Parent Company sometimes does security audits and this
may be one of those, but we will know more tomorrow.
Anyway thank you again for your response.
Peter
"Sue Hoegemeier" wrote:

> Did you also check to see if any security scanning tools
> were run on the server? Something like MBSA?
> Generally there is an entry in the Windows event log when
> running security scanning tools.
> You can use a network sniffer to track login attempts.
> You can find more information on securing SQL Server at the
> SQL Server security center site:
> [url]http://www.microsoft.com/sql/techinfo/administration/2000/security/default.asp[/url]
> -Sue
> On Tue, 12 Apr 2005 03:13:02 -0700, "Peter Nolan"
> <PeterNolan@.discussions.microsoft.com> wrote:
>
>

Possible Fix SQL Server 2005 Express Install Errors 29503 or 1067

Hello,

I spent days trying repeatedly to install SQL Server 2005 Express edition and always received an error message. I uninstalled SQL Server 2000, cleaned the registry, uninstalled/reinstalled Visual Basic 2005 Express, read all the forums and tried every combination. I always received this failure message:

The SQL Server service failed to start. For more information, see the SQL Server Books Online topics, "How to: View SQL Server 2005 Setup Log Files" and "Starting SQL Server Manually." The error is (1067) The process terminated unexpectedly. Error Number : 29503

If this is the particular error you are receiving, then maybe this post will help.

The solution for me was NOT to uninstall Visual Basic 2005 Express Edition, or install SQL Server 2005 in any particular order vs installing VB 2005 Express.

These are the steps that worked for me. Run setup for SQL Server 2005 Express. On the first setup screen, there is a checkbox that says "Hide advanced setup/installation options" (or somesuch). It is checked by default. UNCHECK IT. (You DO want to select advanced options because the defaults are not working for you.)

Look closely at the screens that follow. Look for the dropdown that says "Install SQL Server as a... Network Service. In the dropdown, change the choice to Local Service.

In another screen, look for the Authentication method. Do NOT use Windows authentication, which is the default. Select Mixed Mode authentication, and provide a strong password.

Now, let it install. For me, it no longer failed. SQL Server installed with no further problems.

Hope this helps somebody (including MS in figuring out how come). It worked for me when nothing else did!

SQL Wizard (who?)

SQL Wizard - I am having this exact same issue on my Win XP Pro SP2 machine. However, I tried what you suggested and still got the same error. I also had SQL 2000 installed prior (uninstalled it prior to installing SQL 2005) - did you clean up anything in the registry before you started?

Thanks for your help.....

csqlrun

Possible DTS File Import Bug

Env: SQL Server 2000 on WIN NT 5.x
Job: import multiple flat files into several tables daily.
Catch: one or two of the several flat files might be empty.

First thought/test:
Use [first row as fields] option for the import process.
Problem, DTS can't complete (as a package).

As an alternative, I could probably detect if a file is empty then
decide what to do with it; with a VB ActiveX script it might be feasible.
Question: VB has a command "FileExist"; how about "FileLen" or the
like for determining the length of a file?

TIA.

Hi

See http://www.sqldts.com/default.aspx?292 and
http://www.sqldts.com/default.aspx?246

John

"NickName" <dadada@.rock.com> wrote in message
news:1102971903.347486.145350@.f14g2000cwb.googlegroups.com...
> Env: SQL Server 2000 on in WIN NT 5.x
> Job: import mutiple flat files into several tables daily.
> Catch: one or two of the several flat files might be empty.
> First thought/test:
> Use [first row as fields] option for the import process.
> Problem, DTS can't complete (as a package).
> As an alternative, I could probably detect if a file is empty then
> decide what to do with it, with VB activeX, it might be feasible,
> question, VB has a command for "FileExist", how about "FileLen" or the
> like for determining the length of a file?
> TIA.|||Very helpful. Thank you.|||Very helpful. Thank you. However, an ActiveX problem: error "obj 'string
C:\myDir\file1.csv' required"; the code seems to be correct.

' File Size
' check file size if empty | 0 quit

Option Explicit

Function Main()

Dim oFSO
Dim oFile
Dim sSourceFile

Set oFSO = CreateObject("Scripting.FileSystemObject")

Set sSourceFile = "C:\myDir\file1.csv"

Set sSourceFileV = sSourceFile.Value

Set oFile = oFSO.GetFile(sSourceFileV)

If oFile.Size > 0 Then
Main = DTSTaskExecResult_Success
Else
Main = DTSTaskExecResult_Failure
End If
' Clean Up
Set oFile = Nothing
Set oFSO = Nothing
End Function|||Never mind about the ActiveX error, the objFile.Value attribute was
unnecessary. But the "connectors" do not seem to have an option to
connect this ActiveX script with a package and make sure the
ActiveX script runs first, instead of last.

TIA.|||Hi

You can use workflow to govern the order of the steps. If you had one or
more files to process then you can use the looping example
http://www.sqldts.com/default.aspx?246, although I would expect a three-way
split in the shouldILoop procedure, or a second comparison step, to cater
for files with size, files with no size, and no files to process.

In your code sSourceFileV is not needed, use
Set oFile = oFSO.GetFile(sSourceFile)
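For what it's worth, the same empty-file guard is easy to prototype and test outside DTS. A Python sketch, where os.path.getsize plays the role of FileSystemObject's File.Size:

```python
# Empty-file guard: process a flat file only if it exists and is non-empty,
# mirroring the "If oFile.Size > 0" check in the ActiveX step.
import os
import tempfile

def should_process(path: str) -> bool:
    """True only if the file exists and has a size greater than zero."""
    return os.path.exists(path) and os.path.getsize(path) > 0

with tempfile.TemporaryDirectory() as d:
    empty = os.path.join(d, "empty.csv")
    full = os.path.join(d, "full.csv")
    open(empty, "w").close()            # zero-byte file
    with open(full, "w") as f:
        f.write("col1,col2\n1,2\n")
    print(should_process(empty), should_process(full))  # False True
```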

John

"NickName" <dadada@.rock.com> wrote in message
news:1103038306.437005.86870@.f14g2000cwb.googlegroups.com...
> Very helpful. Thank you. However, activeX problem, error obj "string
> C:\myDir\file1.csv" required, the code seems to be correct.
> ' File Size
> ' check file size if empty | 0 quit
> Option Explicit
> Function Main()
> Dim oFSO
> Dim oFile
> Dim sSourceFile
> Set oFSO = CreateObject("Scripting.FileSystemObject")
> Set sSourceFile = "C:\myDir\file1.csv"
> Set sSourceFileV = sSourceFile.Value
> Set oFile = oFSO.GetFile(sSourceFileV)
> If oFile.Size > 0 Then
> Main = DTSTaskExecResult_Success
> Else
> Main = DTSTaskExecResult_Failure
> End If
> ' Clean Up
> Set oFile = Nothing
> Set oFSO = Nothing
> End Function|||Hi

You can use workflow to determine the order of execution.

John

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!

possible corrupt database?

Hello everyone,
I recently moved a database from a Windows 2000 server (standard)
running SQL 2000 SP3 and placed it on a Windows 2003 server (standard) also
running SQL 2000 SP3. For some weird reason, on our citrix server which is
a BL20P (HP Blade) when my clients access their session and run the
application that is tied into the database that I just moved from one server
to the other, when they are prompted to either cancel or save their work, it
freezes and it doesn't save their work. Prior of moving this database, they
didn't have this problem, the other weird thing is, other people (normal
workstation computers) when they access the same program it works fine and
there are no hang ups. This program is a proprietary program that we use to
issue permits and it's made by HDL companies. I've had no luck with them
because they are pointing fingers at me. My question: is it possible that
the database was corrupted during the move? Is there a way to do a
consistency check on the database? Help...
Thank you
Alex Anderson|||Alex,
Yes, there are tools to do consistency checks. Read about DBCC CHECKDB in
the BOL.
However, if your other non-HDL users are working, then I suspect the
database is probably not corrupted. I would suggest that you recheck the
security issues (grants, usernames, etc) that the HDL documentation should
specify. It may be that an account or account rights got missed during the
move.
Russell Fields
"Alex Anderson" <AAnderson@.Murrieta.org> wrote in message
news:uKw5$3cQEHA.132@.TK2MSFTNGP09.phx.gbl...
> Hello everyone,
> I recently moved a database from a Windows 2000 server (standard)
> running SQL 2000 SP3 and placed it on a Windows 2003 server (standard) also
> running SQL 2000 SP3. For some weird reason, on our citrix server which is
> a BL20P (HP Blade) when my clients access their session and run the
> application that is tied into the database that I just moved from one server
> to the other, when they are prompted to either cancel or save their work, it
> freezes and it doesn't save their work. Prior to moving this database, they
> didn't have this problem, the other weird thing is, other people (normal
> workstation computers) when they access the same program it works fine and
> there are no hang ups. This program is a proprietary program that we use to
> issue permits and it's made by HDL companies. I've had no luck with them
> because they are pointing fingers at me. My question, is it possible that
> the database was corrupted during the move from one server to the
> other? Is there a way to do a consistency check on the
> database? Help...
> Thank you
> Alex Anderson
>|||Russell,
Thank you for responding. I have logged the same user that logs into
citrix (where the problem exists) into a workstation instead, and that same user
doesn't have the problem there. The security I have specified is two groups, and there is
another SQL security user, but that is for administration purposes. I'll
read up on DBCC CHECKDB. It's the strangest problem I've run into, and
I have everyone involved except Microsoft, which would be my last-ditch
effort.
Thank you
Alex Anderson
"Russell Fields" <RussellFields@.NoMailPlease.Com> wrote in message
news:OmZCaGdQEHA.1392@.TK2MSFTNGP09.phx.gbl...
> Alex,
> Yes, there are tools to do consistency checks. Read about DBCC CHECKDB in
> the BOL.
> However, if your other non-HDL users are working, then I suspect the
> database is probably not corrupted. I would suggest that you recheck the
> security issues (grants, usernames, etc) that the HDL documentation should
> specify. It may be that an account or account rights got missed during
the
> move.
> Russell Fields
> "Alex Anderson" <AAnderson@.Murrieta.org> wrote in message
> news:uKw5$3cQEHA.132@.TK2MSFTNGP09.phx.gbl...
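
The consistency check Russell mentions can be run from Query Analyzer along these lines (a sketch; 'PermitsDB' is a placeholder for the name of the moved database):

```sql
-- Full allocation and structural consistency check of the database.
-- NO_INFOMSGS limits the output to actual errors.
DBCC CHECKDB ('PermitsDB') WITH NO_INFOMSGS
```

If this returns no errors, corruption during the move can be ruled out and the security settings become the prime suspect, as Russell suggests.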

Possible Corrupt Table

I have 1 table in a 200+ table database
The database is merge synchronised and has been working fine for over 2
years
The same database is at several customers and the DB is fully
relational

I have a table which creates client timeout errors whenever an insert
or update is issued

The table has foreign keys and a primary key and links parent to
children tables, so if I need to recreate the table I will also need
advice on the best way to do this while keeping the integrity of the
database

I wasn't sure the table was the problem so I deleted all publications
and disabled the server from being a distributor

I cannot find any error logs with any clues, so can only assume this is
the first corruption I have ever seen on SQL 2K (SP3)

I have defragmented the drive, reindexed the tables, shrunk databases
(Plenty of space available)

Please advise any course of action you think may help me.

Regards, Paul Goldney|||[posted and mailed, please reply in news]

paul goldney (paulg@.wizardit.co.uk) writes:
> I have a table which creates client timeout errors whenever an insert
> or update is issued

There is very little information to work from in your post.

If you really suspect corruption, run DBCC CHECKTABLE on the table.

However, I would suggest that there are two other possibilities which
are much more likely:

1) There are triggers on the table that are poorly implemented
and take a long time to execute.
2) There is a blocking issue. The latter can be investigated by
running sp_who while waiting for the INSERT statement to complete.
If you see a non-zero value in the Blk column, the spid in that column
is blocking the spid on that row.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
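
Erland's two checks can be run roughly as follows (a sketch; the table name is a placeholder, and sp_who must be run from a second connection while the INSERT is hanging):

```sql
-- 1) Rule out corruption on the single suspect table (cheaper than a full CHECKDB):
DBCC CHECKTABLE ('dbo.SuspectTable')

-- 2) From another connection, while the INSERT is still waiting:
EXEC sp_who
-- A non-zero value in the Blk column identifies the spid that is
-- blocking the spid shown on that row.
```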

Possible Concurrency Issues.

Hi,
I was wondering if it is possible to call a stored procedure in SQL Server 2005 (call it sp_1) that calls an assembly which takes a message, wraps it in SOAP, calls a web service and waits for a reply from that web service (the stored procedure is CLR, not T-SQL). This web service then needs to call sp_1 to perform some other tasks. Is this possible, or does sp_1 need to have finished what it was doing before it can be called again?

I have been trying to do this and have received a number of errors one of which looks like;

'The context connection is already in use.'

Sorry if I haven't worded it very well, I will try to clear up any questions if you need me to.

Thanks
N
Don't worry about replying to this; I have done some more research. I was trying to open multiple context connections when only one is allowed at a time, which was creating the error.

Possible chain linkage problem

We're getting errors about a possible chain linkage problem when we run a
DBCC check on the database. We're thinking that there's something wrong with a pointer
from a text field to the data. We can do a db repair, but the problem will show
up elsewhere. This only happens on the replicated database, btw.
I'll bet it is with a RID on one of the merge replication system tables. I've
seen this happen before, and what has worked for me is to do the repair. I'm
wondering if you aren't having other disk problems.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
"Marvin" <Marvin@.discussions.microsoft.com> wrote in message
news:C12AEAA4-C571-4DB7-A46B-187C47A869CF@.microsoft.com...
> We're getting these errors when DBCC database check about a possible chain
> linkage problem. We're thinking that there's something wrong with a
> pointer
> from a text field to the data. We can do a db repair but problem will
> show
> up elsewhere. This only happens on the replicated database btw.
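
The repair Hilary refers to would look roughly like this (a sketch; the database name is a placeholder, REPAIR_REBUILD requires single-user mode, and taking a backup first is strongly advised):

```sql
-- Put the replicated database into single-user mode, repair, then restore access.
ALTER DATABASE SubscriberDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DBCC CHECKDB ('SubscriberDB', REPAIR_REBUILD)
ALTER DATABASE SubscriberDB SET MULTI_USER
```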

Possible causes for "login failed for user 'xyz' "

What are some of the possible causes for the following error:
Login failed for user 'xyz' assuming we have mixed mode of authentication on
SQL 2K.
I can think of just an incorrect password entered. Are there any more
reasons?|||I'd think there will be some more detail in the error message. Is that all
you are getting?
--
HTH,
Vyas, MVP (SQL Server)
http://vyaskn.tripod.com/
What hardware is your SQL Server running on?
http://vyaskn.tripod.com/poll.htm
"Hassan" <fatima_ja@.hotmail.com> wrote in message
news:OeEWPGEkDHA.1808@.TK2MSFTNGP09.phx.gbl...
What are some of the possible causes for the following error :
Login failed for user 'xyz' assuming we have mixed mode of authentication on
SQL 2K.
I can think of just an incorrect password entered. Are there any more
reasons?|||We can get authenticated for some apps. I have just profiled some failed
logins and wanted to know what some reasons may be for that error message
seen in the TextData column.
"Narayana Vyas Kondreddi" <answer_me@.hotmail.com> wrote in message
news:%23fvMIHMkDHA.2416@.TK2MSFTNGP10.phx.gbl...
> I'd think there will be some more detail in the error message. Is that all
> you are getting?
> --
> HTH,
> Vyas, MVP (SQL Server)
> http://vyaskn.tripod.com/
> What hardware is your SQL Server running on?
> http://vyaskn.tripod.com/poll.htm
>
>
> "Hassan" <fatima_ja@.hotmail.com> wrote in message
> news:OeEWPGEkDHA.1808@.TK2MSFTNGP09.phx.gbl...
> What are some of the possible causes for the following error :
> Login failed for user 'xyz' assuming we have mixed mode of authentication on
> SQL 2K.
> I can think of just an incorrect password entered. Are there any more
> reasons ?
>
>|||"Hassan" <fatima_ja@.hotmail.com> wrote in message
news:OeEWPGEkDHA.1808@.TK2MSFTNGP09.phx.gbl...
> What are some of the possible causes for the following error :
> Login failed for user 'xyz' assuming we have mixed mode of authentication on
> SQL 2K.
> I can think of just an incorrect password entered. Are there any more
> reasons ?
Connectivity. If it takes a while for the error to appear, consider your
connectivity options.
Kind Regards, Howard

Possible BUG: SP2 Printing Linked Reports in Report Manager

Hello,
I may have discovered a possible bug with the new SP2 Client Printing
capabilities.
To reproduce:
1) Create a report with margins of 0.3in top, left, bottom and right, and
page size of 8.5 x 11 (standard US letter size)
2) Deploy the report to Report Manager
3) Confirm that client printing respects margins set in report. Click on
the print button, click on Print Preview, click on Margins, you should now
see 0.3in margins all around
4) Create a linked report by going to Properties and clicking Create Link
Report button.
5) Click on the print button, click on Print Preview, click on Margins, you
should now see 0.5in margins all around, instead of 0.3in. This does not
occur while exporting linked report to PDF format.
I have confirmed this behaviour numerous times, and it's not only with margins but
also with page layout. Is there a workaround?
Many thanks,
Edgar|||The workaround is to call SetProperties for the linked report, setting the
PageHeight, PageWidth, TopMargin, BottomMargin, LeftMargin and RightMargin
to the values you wish. All values should be expressed in mm.
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"Edgar Allik" <Edgar Allik@.discussions.microsoft.com> wrote in message
news:0F3C537B-ECBF-4B69-BCE3-BFF8143B7BED@.microsoft.com...
> Hello,
> I may have discovered a possible bug with the new SP2 Client Printing
> capabilities.
> To reproduce:
> 1) Create a report with margins of 0.3in top, left, bottom and right, and
> page size of 8.5 x 11 (standard US letter size)
> 2) Deploy the report to Report Manager
> 3) Confirm that client printing respects margins set in report. Click on
> the print button, click on Print Preview, click on Margins, you should now
> see 0.3in margins all around
> 4) Create a linked report by going to Properties and clicking Create Link
> Report button.
> 5) Click on the print button, click on Print Preview, click on Margins,
> you
> should now see 0.5in margins all around, instead of 0.3in. This does not
> occur while exporting linked report to PDF format.
> I confirmed this behaviour numerous times, and it's not only with margin
> but
> also with page-layout. Is there a work-around?
> Many thanks,
> Edgar
>|||Hi, I am also experiencing the same error in SP2 where the page layout
properties don't seem to be set as they should be in a linked report.
I'm presuming "call SetProperties" is done via code - is there a
workaround or a way of doing this via Report Manager to fix this problem?
I'm not using any code with my reports, just URL access.
Thanks
Jeanine
Daniel Reib (MSFT) wrote:
> The work around is to call SetProperties for the linked report, setting the
> PageHeight, PageWidth, TopMargin, BottomMargin, LeftMargin and RightMargin
> to the values you wish. All values should be expressed in mm.
> -Daniel
> This posting is provided "AS IS" with no warranties, and confers no rights.
> "Edgar Allik" <Edgar Allik@.discussions.microsoft.com> wrote in message
> news:0F3C537B-ECBF-4B69-BCE3-BFF8143B7BED@.microsoft.com...
> > Hello,
> > I may have discovered a possible bug with the new SP2 Client Printing
> > capabilities.
> > To reproduce:
> > 1) Create a report with margins of 0.3in top, left, bottom and right, and
> > page size of 8.5 x 11 (standard US letter size)
> > 2) Deploy the report to Report Manager
> > 3) Confirm that client printing respects margins set in report. Click on
> > the print button, click on Print Preview, click on Margins, you should now
> > see 0.3in margins all around
> > 4) Create a linked report by going to Properties and clicking Create Link
> > Report button.
> > 5) Click on the print button, click on Print Preview, click on Margins, you
> > should now see 0.5in margins all around, instead of 0.3in. This does not
> > occur while exporting linked report to PDF format.
> > I confirmed this behaviour numerous times, and it's not only with margin
> > but also with page-layout. Is there a work-around?
> > Many thanks,
> > Edgar
> >
possible bug: sp_attach_single_file_db

The scenario:
The server in question has its Default Log Directory set
to "L:\". I used sp_attach_single_file_db to attach an
mdf file. Here is the command that I used:
EXEC sp_attach_single_file_db @dbname = 'test2',
@physname = 'f:\test2_Data.MDF'
I received the following error:
Device activation error. The physical file
name 'L:\test2_Log.LDF' may be incorrect.
New log file 'L:\\test2_log.LDF' was created.
Notice the 2 back-slashes in the path of the log file! I
didn't notice it at first, and eventually attempted to
restore the database from a backup taken from a different
server, using a 'with move'. Here is the command that I
used for that:
RESTORE DATABASE test2
FROM DISK = 'x:\test2.bak'
with MOVE 'test2_Data' TO 'f:\SQL_Data\test2_Data.mdf',
MOVE 'test2_Log' TO 'L:\test2_Log.ldf'
and the response:
Server: Msg 3156, Level 16, State 1, Line 1
File 'test2_Log' cannot be restored to 'L:\test2_Log.ldf'.
Use WITH MOVE to identify a valid location for the file.
Server: Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
When I discovered the source of the problem, I inserted
the second back-slash in the move clause, and the restore
worked.
So, there may actually be 2 issues:
1. The extra back-slash that SQL expects to be in the path.
2. The erroneous error message associated with the restore
statement.
Has anyone else seen this?
Steve Phelps
SQL Server DBA
American Fidelity Group|||The extra backslash is due to a bug in EM: if you use the browse
button to select the default directories, it appends a trailing backslash
to the folder path, whereas SQL actually expects the value not to have a trailing
backslash. This is easily amended in EM by removing it, although it generally
doesn't cause problems beyond the filename looking funny. I've not come across
the restore issue; I'll try to repro that.
--
HTH
Jasper Smith (SQL Server MVP)
I support PASS - the definitive, global
community for SQL Server professionals -
http://www.sqlpass.org
"Steve Phelps" <Steve.Phelps@.af-group.com> wrote in message
news:014201c356ce$1fa90c50$a601280a@.phx.gbl...
The scenario:
The server in question has its Default Log Directory set
to "L:\". I used sp_attach_single_file_db to attach an
mdf file. Here is the command that I used:
EXEC sp_attach_single_file_db @dbname = 'test2',
@physname = 'f:\test2_Data.MDF'
I received the following error:
Device activation error. The physical file
name 'L:\test2_Log.LDF' may be incorrect.
New log file 'L:\\test2_log.LDF' was created.
Notice the 2 back-slashes in the path of the log file! I
didn't notice it at first, and eventually attempted to
restore the database from a backup taken from a different
server, using a 'with move'. Here is the command that I
used for that:
RESTORE DATABASE test2
FROM DISK = 'x:\test2.bak'
with MOVE 'test2_Data' TO 'f:\SQL_Data\test2_Data.mdf',
MOVE 'test2_Log' TO 'L:\test2_Log.ldf'
and the response:
Server: Msg 3156, Level 16, State 1, Line 1
File 'test2_Log' cannot be restored to 'L:\test2_Log.ldf'.
Use WITH MOVE to identify a valid location for the file.
Server: Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
When I discovered the source of the problem, I inserted
the second back-slash in the move clause, and the restore
worked.
So, there may actually be 2 issues:
1. The extra back-slash that SQL expects to be in the path.
2. The erroneous error message associated with the restore
statement.
Has anyone else seen this?
Steve Phelps
SQL Server DBA
American Fidelity Group|||You are aware you have to restart the SQL Service for changes to the default
directory to take effect ? A little test I did was to create a database when
the default log directory had a trailing backslash and confirm it had a \\
in its name. Change the default log directory to remove the trailing
backslash and restart the SQL Service. I then detached the database created
earlier, deleted the log file and attached it using
sp_attach_single_file_db. This created the log file with the correct path
(no \\). The device activation error referred to the path with the \\ in it
because it's reading the original logfile location from the database header
in the mdf file. Hope this makes things a little clearer
--
HTH
Jasper Smith (SQL Server MVP)
I support PASS - the definitive, global
community for SQL Server professionals -
http://www.sqlpass.org
"Steve Phelps" <Steve.Phelps@.af-group.com> wrote in message
news:038101c356e2$dc0aaa70$a501280a@.phx.gbl...
Jasper,
Thanks for the quick response. I had considered that
possibility and tried both with and without the slash when
I set the default directory. In neither case did I use
the browse button for that purpose. In both cases, the
default directory *appeared* to have been set properly.
Only when I used the sp_attach_single_file_db stored
procedure did the problem arise. My original thought was
that it was a bug in the stored proc. I hadn't looked at
the source code because I assumed that it was encrypted.
It isn't. I parsed out the code that builds the "Create
database" statement and executed it, printing the
resulting command:
CREATE DATABASE [test2]
ON (FILENAME ='f:\sql_data\test2_data.mdf' )
FOR ATTACH
then pasted it into Query Analyzer and executed it. Got
the same result. The new log file's path contains 2
slashes.
Steve Phelps
SQL Server DBA
American Fidelity Group
>--Original Message--
>The extra backslash is due to a bug in EM in that if you use the browse
>button to select the default directories it appends the trailing backslash
>to the folder path, SQL actually expects the value to not have a trailing
>backslash (easily ammended in EM by removing it although it generally
>doesn't cause problems except the filename looking funny. Not come across
>the restore issue, I'll try and repro that.
>--
>HTH
>Jasper Smith (SQL Server MVP)
>I support PASS - the definitive, global
>community for SQL Server professionals -
>http://www.sqlpass.org
>"Steve Phelps" <Steve.Phelps@.af-group.com> wrote in message
>news:014201c356ce$1fa90c50$a601280a@.phx.gbl...
>The scenario:
>The server in question has its Default Log Directory set
>to "L:\". I used sp_attach_single_file_db to attach an
>mdf file. Here is the command that I used:
>EXEC sp_attach_single_file_db @dbname = 'test2',
> @physname = 'f:\test2_Data.MDF'
>I received the following error:
>Device activation error. The physical file
>name 'L:\test2_Log.LDF' may be incorrect.
>New log file 'L:\\test2_log.LDF' was created.
>Notice the 2 back-slashes in the path of the log file! I
>didn't notice it at first, and eventually attempted to
>restore the database from a backup taken from a different
>server, using a 'with move'. Here is the command that I
>used for that:
>RESTORE DATABASE test2
> FROM DISK = 'x:\test2.bak'
> with MOVE 'test2_Data' TO 'f:\SQL_Data\test2_Data.mdf',
> MOVE 'test2_Log' TO 'L:\test2_Log.ldf'
>and the response:
>Server: Msg 3156, Level 16, State 1, Line 1
>File 'test2_Log' cannot be restored to 'L:\test2_Log.ldf'.
>Use WITH MOVE to identify a valid location for the file.
>Server: Msg 3013, Level 16, State 1, Line 1
>RESTORE DATABASE is terminating abnormally.
>When I discovered the source of the problem, I inserted
>the second back-slash in the move clause, and the restore
>worked.
>So, there may actually be 2 issues:
>1. The extra back-slash that SQL expects to be in the
path.
>2. The erroneous error message associated with the restore
>statement.
>Has anyone else seen this?
>Steve Phelps
>SQL Server DBA
>American Fidelity Group
>
>.
>
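
For completeness, one way to sidestep the default-log-directory handling altogether is to attach the database with the log file named explicitly, e.g. via sp_attach_db (a sketch using the paths from the posts above):

```sql
-- Attach with both files spelled out, so the server never has to
-- construct a log path from the default log directory setting.
EXEC sp_attach_db @dbname = 'test2',
    @filename1 = 'f:\SQL_Data\test2_Data.mdf',
    @filename2 = 'L:\test2_Log.ldf'
```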