Friday, March 30, 2012

One Node One Machine Cluster?

Hi,
I am in the process of building a SQL Server "farm," but I'm having to slow the
pace due to cost. I have just purchased a server with 2 CPUs: RAID1 (2 disks/10K
rpm/146 GB) on channel 0 and RAID5 (3 disks/10K rpm/146 GB) on channel 1. This
summer, I plan to buy an identical machine. They will both run Win 2003 EE
with SQL Server 2000 EE. Rather than install SQL Server locally on my
existing machine, is there a problem with installing MSCS and a virtual
server, even though I only have the one machine now? Of course, it won't
give me failover protection, but it would give me the configuration I want
when I get my second box. Does this make any sense? Any caveats or words of
wisdom?
PS. The servers will be accessed in OLTP mode by an ADO.NET app running on
Terminal Server (Win 2003), and by BizTalk Server 2004 - also on a Win 2003 box.
Most of the ADO.NET traffic will occur during business hours, with BTS
getting busy after hours.
Thanks.
John
Hi
Why not build a single-node cluster instead? Build a normal cluster
node, but don't add another node. Once you have bought your other machine, add
it to the cluster.
Just be aware that it is very difficult to get exactly the same hardware months
apart. If you are going to use clustering, make sure all the hardware is on
the Windows Hardware Compatibility List (HCL).
A cluster "on the cheap" will just cause you hassles later. I've seen enough
"let's do it with as little as we can" builds, and the cluster ends up with worse
uptime than a single machine.
Regards
Mike
"JT" wrote:

> Hi,
> I am in the process of building a SQL Server "farm," but having to tone down
> the pace due to $$. I have just purchased a RAID1 (2 disc/10K rpm/146 GB)
> Channel 0, RAID5 (3 disc/10K rpm/146 GB) Channel 1 server with 2 CPU's. This
> summer, I plan to buy an identical machine. They will both run Win 2003 EE,
> with SQL Server 2000 EE. Rather than install SQL Server locally on my
> existing machine, is there a problem with installing MSCS and a virtual
> server, even though I only have the one machine now. Of course, it won't
> give me failover protection, but it would give me the configuration I want
> when I get my second box. Does this make any sense? Any caveats or words of
> wisdom?
> PS. The servers will be accessed in OTLP mode by an ADO.NET app running on
> Terminal Server (Win 2003), and BizTalk Server 2004 - also on a Win 2003 box.
> Most of the ADO.NET traffic will occur during business hours, with BTS
> getting busy after hours.
> Thanks.
> --
> John

One node Cluster for SQL 2005

Hi, does anyone know if you can cluster SQL Server on a one-node cluster in Windows 2003?
Regards Craig
Yes you can. You can even cluster SQL 2005 in Virtual Server with two
nodes, but it takes a bit of work.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"Craig Gibb" <Craig Gibb@.discussions.microsoft.com> wrote in message
news:E4E75A9E-46BC-4DB4-9D22-B5862BBAFBF1@.microsoft.com...
|||Hi, thanks for the info, Geoff. Just now I am having a problem with DTC and
Windows 2003 SP1. I think there is a COM+ rollup package for the problem
(KB 897667), but I can't find it anywhere. MS support is closed here and I
would really like to install it this evening. Any ideas?
Regards Craig
"Geoff N. Hiten" wrote:

> Yes you can. You can even cluster SQL 2005 in Virtual Server with two
> nodes, but it takes a bit of work.
> --
> Geoff N. Hiten
> Senior Database Administrator
> Microsoft SQL Server MVP
>
> "Craig Gibb" <Craig Gibb@.discussions.microsoft.com> wrote in message
> news:E4E75A9E-46BC-4DB4-9D22-B5862BBAFBF1@.microsoft.com...
>
>
|||Here is what I usually have to do with DTC on a Windows 2003 Server.
HOWTO: Enable DTC Between Web Servers and SQL Servers Running Windows Server
2003
http://support.microsoft.com/kb/555017/en-us
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"Craig Gibb" <CraigGibb@.discussions.microsoft.com> wrote in message
news:304D6E36-D457-4FDB-A46D-D3941CA3DF86@.microsoft.com...

One MS SQL Server 2000 instance blocks the other (through a DTS job)

Hello,

we use two instances of MS SQL Server 2000 (version 8.00.760) on a
4-processor Windows 2000 (SP4) machine.
A DTS package on one instance completely blocks the work on the other
instance. Both instances have 2 dedicated processors. The two
processors of the blocked instance are idle.
How is this possible? Has anyone seen the same behavior of SQL
Server and found a solution for this problem?

Thanks in advance for answers

Jürgen Simonsen (juergen.simonsen@.web.de) writes:

Have you tried to access that blocked instance, to see if it is possible
to run queries on it? Have you checked the SQL Server error log for the
blocked instance?
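A quick way to act on this advice is to query sysprocesses on the blocked instance; a minimal sketch in SQL Server 2000 syntax (no blocking shows zero rows):

```sql
-- Show sessions that are blocked, and which spid is blocking them
SELECT spid, blocked, waittype, lastwaittype, waitresource, cmd
FROM master..sysprocesses
WHERE blocked <> 0;

-- sp_who2 gives a similar overview, including the BlkBy column
EXEC sp_who2;
```

If this returns nothing and queries run normally, the problem is more likely resource contention (CPU, I/O) than SQL-level blocking.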

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

one more time

I'm sorry for my bad English.

I have a DTS package in SQL Server 2005.

For example, in SQL Server 2000 I can do this:

right-click the DTS package and choose the option "Schedule Package", and this creates a schedule; what I mean is that this produces a "job".

What I mean is that I need to know how to create a schedule for DTS packages in SQL 2005.

Could anyone provide a solution for this?

Okay, I completely misunderstood what you wanted before. But I think I can help you now.

Once you have the SSIS package (DTS packages mean something else - like packages from SQL Server 2000 DTS), you can use SQL Server Agent to schedule the package to be run at a certain time.

Use the SQL Server Agent to schedule your package.

|||

sorry man:

What I do in SQL Server 2005 is:

right-click on SQL Server Agent, choose the option "New", and choose the option "Schedule", but that isn't what I want; maybe I am not being specific.

When I right-click a DTS package in SQL Server 2000 and choose the option "Schedule Package", this produces a schedule and I can see it under the jobs.

Now, how can I do that in SQL Server 2005? I need to see the jobs I created for my DTS packages. You told me that I can do it with SQL Server Agent in SQL Server 2005, but that doesn't do what I want.

Could anyone provide a solution for this?
Thanks :)

|||

limon26,

I merged this post into your earlier thread. If you want to respond to another reply, you should use the reply button, not the new thread button. This keeps the discussion in one thread, and others who can help will be able to follow it more easily.

Thanks.

|||

I apologize for that.

So what can I do about my DTS packages in SQL Server 2005?

|||

I am not that experienced with SSIS for 2005; hopefully swells or somebody else will be able to help you further...

|||

In SQL Server 2000, you could create a schedule by right-clicking on a package in Enterprise Manager and selecting "Schedule Package." SQL Server 2000 created the SQL Server Agent job for you. You could find the jobs in Enterprise Manager under the Management folder, in the jobs tree.

In SQL Server 2005 there are some changes ...

1. SQL Server Agent is not automatically enabled. Make sure the service has been started.

2. You do not automatically create scheduled jobs for SSIS. Right-click on the SQL Server Agent icon in the Object Explorer in Management Studio and select the option to create a new Agent job. When you do that, you can add a new job step to execute an SSIS package. You can also add a new schedule.

The end result is similar to 2000 - jobs are listed under the SQL Server Agent node; jobs have steps - one step executes a package; and jobs have schedules.

An important difference is that SSIS now has an Agent subsystem for better-integrated execution.

hth

Donald
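The steps above can also be scripted against msdb using the Agent stored procedures; a minimal sketch, in which the job name, package path, server name and schedule time are hypothetical:

```sql
USE msdb;

-- Create the job
EXEC dbo.sp_add_job
    @job_name = N'Run MyPackage nightly';

-- Add a step that executes the SSIS package (the 'SSIS' Agent subsystem)
EXEC dbo.sp_add_jobstep
    @job_name = N'Run MyPackage nightly',
    @step_name = N'Execute package',
    @subsystem = N'SSIS',
    @command = N'/SQL "\MyPackage" /SERVER localhost';

-- Run every day at 02:00
EXEC dbo.sp_add_jobschedule
    @job_name = N'Run MyPackage nightly',
    @name = N'Nightly',
    @freq_type = 4,        -- daily
    @freq_interval = 1,
    @active_start_time = 020000;

-- Target the local server
EXEC dbo.sp_add_jobserver
    @job_name = N'Run MyPackage nightly';
```

The resulting job appears under the SQL Server Agent node in Management Studio, just like one created through the UI.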

|||

After you create your SSIS package, here are the steps you have to follow to make a job run.

My SSIS package runs every 10 minutes. Here is what I do to make it work:

Create a proxy account, run SQL Server Agent under your NT account, have the job step run under the "SQL Agent Service Account", and make your NT account the owner of the job. It works perfectly.

I don't know if you know how to create a proxy account. Here is how to create one:

USE master

CREATE CREDENTIAL [MyCredential] WITH IDENTITY = 'yourdomain\myWindowAccount', SECRET = 'WindowLoginPassword'

USE msdb

EXEC sp_add_proxy @proxy_name = 'MyProxy', @credential_name = 'MyCredential'

EXEC sp_grant_login_to_proxy @login_name = 'devlogin', @proxy_name = 'MyProxy'

EXEC sp_grant_proxy_to_subsystem @proxy_name = 'MyProxy', @subsystem_name = 'SSIS'

kenny

|||Similar problems, but (so far) the credentials/proxy doesn't seem to resolve it.

We have two people who have developed some SSIS packages, myself and another developer. The other developer's SSIS package runs fine from BIDS, DTexec and DTexecUI, but not from SQL Server Agent. My SSIS package runs fine from BIDS, DTexec, DTexecUI and the local SQL Server Agent. I haven't had the opportunity to have it deployed into another environment to check if I have problems from the server.

What was interesting to me, in attempting to test the credentials, was that if I edit the working SQL Server Agent job to use the proxy, it fails. If I edit the job back to using the SQL Agent Service Account (on the Run As tab on the task), then it works successfully.

Now this may be because I've set up the credential incorrectly, since it was my first time (be gentle :-)). I'm going to try and modify the script above and see if that works any better. The credential I set up used a local user I created that was part of the Power Users group. The package does file operations locally (based on the current configuration file). The 'ProtectionLevel' was set to 'EncryptSensitiveWithPassword', and this specific test job is using SQL Server as the package source.

Any help? The KB http://support.microsoft.com/kb/918760 states that 'This behavior is by design.' - but surely it should also be easy to use?
|||Using the script above, I generated a proxy and a credential. The user being used is the same one I created earlier (which is a member of the Power Users group on my laptop). I get the following error:

Date: 25/07/2006 14:21:59
Log: Job History (Test Excel Orchestration)
Step ID: 1
Server: MyLaptop\SQL2005_SP1_TEST
Job Name: Test Excel Orchestration
Step Name: Invoke Excel Orchestration
Duration: 00:00:01
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0
Message:
Executed as user: MyLaptop\SSISAgentJobCredUser. The package could not be loaded. The step failed.

This proves the credential at least worked - it did try to run as the correct user, but still no joy.

.... Even less joy now, since I get the following when trying to use it the way that worked before:

Date: 25/07/2006 14:27:45
Log: Job History (Test Excel Orchestration)
Step ID: 1
Server: MyLaptop\RGALBRAITH\SQL2005_SP1_TEST
Job Name: Test Excel Orchestration
Step Name: Invoke Excel Orchestration
Duration: 00:00:00
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0
Message:
Non-SysAdmins have been denied permission to run DTS Execution job steps without a proxy account. The step failed.
|||

Hey,

Today I got the same error as you. To fix it, I didn't create the credentials; I changed the owner of the scheduled jobs instead, and it works great!

|||

I fixed this by adding the owner of the package to the Agent roles in msdb.

take a look at this link:

http://www.microsoft.com/technet/prodtechnol/sql/2005/newsqlagent.mspx
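For reference, adding a login to one of the msdb Agent roles can be scripted with sp_addrolemember; the role and login names below are examples only:

```sql
USE msdb;

-- Let a (hypothetical) login own and run its own Agent jobs
EXEC sp_addrolemember
    @rolename = N'SQLAgentUserRole',
    @membername = N'yourdomain\packageOwner';
```

SQL Server 2005 also provides SQLAgentReaderRole and SQLAgentOperatorRole with progressively broader Agent permissions.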

One more solution for pagination

Let me know what you think about the following code:

DECLARE @MaxIdValue int
DECLARE @MaxSortFieldValue nvarchar(50)

SELECT TOP 1 @MaxIdValue = [id], @MaxSortFieldValue = [SortField]
FROM (
    SELECT TOP PageNumber*RowsPerPage [id], [SortField]
    FROM MyTable
    WHERE (FilterCondition) ORDER BY [SortField], [id]
) T
ORDER BY [SortField] DESC, [id] DESC

SELECT TOP RowsPerPage * FROM MyTable
WHERE ([SortField] >= @MaxSortFieldValue) AND (([id] > @MaxIdValue) OR
    ([SortField] <> @MaxSortFieldValue)) AND (FilterCondition)
ORDER BY [SortField], [id]

This is dynamic SQL, and it should be easy to fill in.
PageNumber, RowsPerPage, FilterCondition and SortField are going to be
variables and will be based on the user's search condition/criteria.
----------
Thanks for your attention.

"Kurzman" <max@.virtuman.com> wrote in message
news:d8f86969.0408051335.56e477a2@.posting.google.com...

What I think is that your syntax is wrong :-)

If you want a pagination solution, then the usual answer is "do it on the
client side" - this article may be useful:

http://www.aspfaq.com/show.asp?id=2120

Simon
|||

The client side isn't the primary concern. The bottleneck in pagination is the
server side, especially for big tables. From this point of view, the following
code will work slowly. Also, it will work correctly only if ArtistName
+ '~' + Title is unique:

ArtistName + '~' + Title >= @aname + '~' + @title

The following code doesn't have this problem:

([SortField] >= @MaxSortFieldValue) AND (([id] > @MaxIdValue) OR
([SortField] <> @MaxSortFieldValue)) AND (FilterCondition)
ORDER BY [SortField], [id]

If an index on [SortField], [id] exists, it works fast.

P.S. Thanks for the link to an interesting article.
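For completeness, the composite index the poster refers to could be created like this (the table and column names follow the example above):

```sql
-- Composite index supporting both the keyset predicate and the ORDER BY
CREATE INDEX IX_MyTable_SortField_Id
    ON MyTable ([SortField], [id]);
```

With this index in place, the second SELECT can seek directly to the saved position instead of scanning and sorting the whole table.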

One more question regarding Reporting action

All the reporting action samples I have read from AdventureWorks, when passing a link and parameter to the report in Reporting Services, link to a report based on an Analysis Services data source.

I am wondering if a reporting action can pass a parameter to a Reporting Services report based on a regular relational database, rather than an Analysis Services data source.

If yes, could you share a simple sample?

I can't see why the data source of a report would matter when using a reporting action. The Adventure Works examples may all be using an Analysis Services data source just for convenience. Have you tried editing an existing Adventure Works reporting action to point to a relational report instead?
|||

Hi Deepak, thank you so much for your reply! Yes, I did try the following using the AW AS and AW sample reports RS projects:

Here is my reporting action:

Name: Sales Order Detail

Target Type: Dimension members

Target object: Internet Sales Order Detail

ReportServer: localhost

ReportPath: ReportServer?/AdventureWorks Sample Reports/Sales Order Detail

Parameter Name: SalesOrderNumber (matching the one in the Sales Order Detail report)

Parameter Value: [Internet Sales Order Details].[Sales Order Number].[Sales Order Number].CurrentMember.Name

And I made sure the Sales Order Detail report runs correctly and the parameter matches the parameter name I defined in the cube's reporting action. Then I deploy the cube; however, when I drop in the Sales Order Number dimension and right-click, there is no reporting action showing up.

Could you give me some pointers?

Thanks

|||

Did you configure the "Caption", under "Additional Properties" - if not, you can refer to the Adventure Works "Sales Reason Comparisons" action, whose Caption is defined as an MDX expression:

"Sales Reason Comparisons for " +

[Product].[Category].CurrentMember.Member_Caption + "..."

|||

Thanks again. Just did that and deployed.

"Sales Reason Comparisons for " + [Internet Sales Order Details].[Sales Order Number].[Sales Order Number].CurrentMember.Member_Caption + "..."

However, the reporting action is still not showing. Deepak, is there anything else I might have missed?

Thanks

|||Can't think of anything else - are you able to see the standard reporting action which comes with Adventure Works, i.e. "Sales Reason Comparisons"? If so, I would suggest making a test copy of Adventure Works and progressively modifying this action to incorporate your report in small steps, till it breaks - or maybe it doesn't break!

One More Question On Running Parallel Queries

An Execute SQL task takes 1 min to run a statement "insert into Mytable select * from view_using_joins"

Output: 10,225 rows affected.

But a Dataflow task configured to fetch data from the same view_using_joins into MyTable takes hours to do the same.

Could you please explain why this is so?

Thanks

Subhash Subramanyam

I am not surprised that the Execute SQL Task is quicker. When you're inserting from one table to another in the same database then SSIS isn't going to outperform the database engine.

I don't know why it is hours quicker. There isn't really enough information here to say. What destination adapter are you using? Is the package running on the same machine as the database? Are you doing transformations on the way?

-Jamie

|||

Hi Jamie,

Thanks for your reply.

1) I am running the packages on a different server.

2) Using OLE DB adapters for source and destination. The database as well as the server are the same for source and destination.

3) No transformations in between.

I wonder if we have to apply some extra settings here to achieve the same duration as the Execute SQL task?

Thanks

Subhash Subramanyam

|||

Have you chosen Fast Load on your destination?

Thanks.

|||

Subhash512525 wrote:


You're running the package on a different server? I'd suggest that's your problem right there. The data is going to have to go over the network - obviously this is going to take time.

Also, in your destination are you inserting with Fast Load?

I say again, in this scenario the data-flow isn't going to outperform the Execute SQL Task.

-Jamie

|||

Great, Jamie, you figured it out.

If you don't mind spending a few minutes here, I am coming back to my actual scenario.

I would certainly appreciate specific views from the experts here on each of these questions: Phil B, Rafael S, Darren G, Jwelch, JayH, Scott B, Ashwin S, Brian, Bob, Donald F and many others I am still not aware of.

Scenario:

1) My SSIS packages are run on a US server, scheduled during the night.

2) Each package runs 6-8 queries, each with joins, pulling data in parallel from an Oracle database source (UNIX) in Europe; the total data extracted does not exceed 5 million rows.

3) The destination database is in the US.

4) Network bandwidth: 2 Mbps.

The problem is that it takes ages to execute these packages (ranging from 25 hours to 30 hours).

Questions are:

1) Where should I run the SSIS packages to get better performance?

2) How can I perform only an incremental load (using a Data Flow task), taking performance aspects into consideration? (Any links for this would help.)

3) Does the overlap of the schedules for SSIS packages affect performance?

4) Are there any limits on the number of queries running in parallel to pull data from an Oracle source?

5) Would it be best if I spool the query results into flat files on a local system where the source (Oracle database) runs in Europe, and then FTP them to a shared server in the US, which I can use for importing into the destination table?

Waiting for your reply,

Many Thanks and Regards

Subhash Subramanyam

|||Thanks Bob. Please give your answers to my questions below, if you don't mind.
|||The more work you can do to avoid keeping the data transmission "pipe" open, the better.

Perform your source query in Europe, export that to a file, compress it, and then FTP it to the US. Then uncompress it, and load it with SSIS.

The idea is to keep your transmissions across "the pond" as short as possible.|||

From your question #2, I'm assuming you are pulling all rows every night. As Phil mentioned, you want to minimize how much data you are actually moving, so I'd definitely make this incremental. A common way to implement that is by checking modified dates on the source system via a WHERE clause in your source SELECT statements. Store the range of modified dates that you retrieve, and when the package runs the next night, start from the end of the previous range.

If you don't have modified dates in the source system, consider adding them. Alternatives are using triggers to track changes, or using a change data capture tool - I believe Oracle has one, and SQL Server will have one with SQL Server 2008.
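A minimal sketch of the watermark approach described above, in T-SQL; the table and column names (ExtractWatermark, Orders, ModifiedDate) are hypothetical:

```sql
-- Watermark table storing the end of the last extracted range
CREATE TABLE dbo.ExtractWatermark (
    SourceTable sysname PRIMARY KEY,
    LastModified datetime NOT NULL
);

DECLARE @LastModified datetime, @Now datetime;
SET @Now = GETDATE();

SELECT @LastModified = LastModified
FROM dbo.ExtractWatermark
WHERE SourceTable = 'Orders';

-- Pull only rows changed since the previous run
SELECT *
FROM Orders
WHERE ModifiedDate > @LastModified
  AND ModifiedDate <= @Now;

-- Advance the watermark for the next run
UPDATE dbo.ExtractWatermark
SET LastModified = @Now
WHERE SourceTable = 'Orders';
```

Capturing @Now once, and using a half-open range, avoids missing rows that are modified while the extract is running.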

|||

One more question:

6) If I have 6-8 queries running in parallel, does having a common Connection Manager (for the Oracle source) for all of them perform better, or do distinct Connection Managers perform better?

I am still expecting suggestions and the views of the rest of the experts on the six questions listed here.

Regards

Subhash Subramanyam

|||

Subhash512525 wrote:


It depends. :) Using a single one should result in the same performance as having several, assuming you are not using RetainSameConnection on them. Having a single connection manager doesn't mean that SSIS won't open multiple connections to the database. A connection manager manages multiple connections to the database, unless you force it to use only a single connection with RetainSameConnection.

A related note - in your scenario, have you tested whether performance is better if you run all queries sequentially or in parallel (by using precedence constraints on the data flow tasks)?

|||

jwelch wrote:

A related note - in your scenario, have you tested whether performance is better if you run all queries sequentially or in parallel (by using precedence constraints on the data flow tasks)?

Jwelch, this seems more practical. I'll test this and let you know.

Thanks

Subhash