|
It is bad form to try to promote your article in the forums. You will get feedback when people read your article.
|
|
|
|
|
Sigh! I thought I had already been through all this with one person.
Look, I wanted to post my message only once, in one place. But when I am confronted with a message like 'Message Removed', what is a newbie like me supposed to think? That maybe this isn't the right place to post a message like this. The same thing happened when I went to post the message elsewhere too, hence the repetition.
|
|
|
|
|
You are not a newbie; you have been a member for over 7 years. Plenty of time to read the stickies at the top of each forum.
|
|
|
|
|
I am a newbie when it comes to posting an article here.
I am a newbie when it comes to posting a message here.
I am a newbie when it comes to being confronted with a message like 'Message Removed' after making an edit to a message here.
I might have joined CodeProject 7 years ago, but I have not been an active community member here.
|
|
|
|
|
|
This is my table structure. I selected the "for index types that do not support online index rebuild, rebuild indexes offline" option in the rebuild task of my maintenance plan (SQL Server 2012 maintenance plan), but I am still getting this error:
failed with the following error: "An online operation cannot be performed for index 'PK_Table1_1' because the index contains column 'FileContent' of data type text, ntext ..."
Table structure:
CREATE TABLE [dbo].[Table1]
(
    [ColumnID] [INT] NOT NULL,
    [ColumnName] [NVARCHAR](250) NOT NULL,
    [FileContent] [VARBINARY](MAX) FILESTREAM NOT NULL,
    [ColumnDate] [VARCHAR](50) NOT NULL,
    [UserID] [VARCHAR](50) NOT NULL,
    [DefaultColumn] [BIT] NOT NULL,
    [ID] [UNIQUEIDENTIFIER] ROWGUIDCOL NOT NULL
        CONSTRAINT [DF__TblCheque__ID__398D8EEE] DEFAULT (NEWID()),
    [ReadOnly] [BIT] NOT NULL,
    CONSTRAINT [PK_Table1_1] PRIMARY KEY CLUSTERED ([ColumnID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
        ON [PRIMARY] FILESTREAM_ON [FileStreamG],
    CONSTRAINT [UQ__Table1__3214EC26C89CBB09] UNIQUE NONCLUSTERED ([ID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 70)
        ON [PRIMARY]
) ON [PRIMARY] FILESTREAM_ON [FileStreamG]
modified 26-Oct-16 1:47am.
|
|
|
|
|
The answer is literally in the error message.
From MSDN[^]
Clustered indexes must be created, rebuilt, or dropped offline when the underlying table contains large object (LOB) data types: image, ntext, text, varchar(max), nvarchar(max), varbinary(max), and xml.
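A minimal sketch of the offline rebuild, using the table and index names from the question (run it in a maintenance window, since the table stays locked for the duration):

```sql
-- ONLINE = OFF is required here because FileContent is varbinary(max),
-- a LOB type that blocks online rebuilds in this setup.
ALTER INDEX [PK_Table1_1] ON [dbo].[Table1]
REBUILD WITH (ONLINE = OFF);
```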
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Yes, I know that: the table has a varbinary(max) column, so the index cannot be rebuilt online. But I checked the option 'for index types that do not support online index rebuild, rebuild indexes offline' in the SQL maintenance plan (SQL Server 2012). So the plan should rebuild it offline, yet it still tries to rebuild online. Why?
|
|
|
|
|
When you create the table, you are creating an index. Creating is not the same as rebuilding.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
According to this KB article[^], SQL 2012 should be able to rebuild the index online.
Try installing SP3[^], possibly followed by the latest cumulative update[^].
Or, if you've got SP2, install CU4[^] or one of the later cumulative updates[^].
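If the update applies, an explicit rebuild is a quick way to confirm whether the online path now works (index and table names taken from the question):

```sql
-- Try the online rebuild directly; if the fix is in place this should
-- succeed even though the index contains a varbinary(max) column.
ALTER INDEX [PK_Table1_1] ON [dbo].[Table1]
REBUILD WITH (ONLINE = ON);
```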
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Hi,
I am executing an SSIS package from my C# code. It reports that the package executed successfully, but when I check the database, the values have not been inserted into that server. During execution
I get a warning: "Configuration from a parent variable "ServerName" did not occur because there was no parent variable collection." If the package really has issues with its connections and variables, how can I set the variable values differently than below? Below is the code showing how I set the package connections and variables.
It seems the package is using some configurations. Is there any way I can modify the existing package configurations at run time using C# code? I don't want to add new configurations to the package; I want to edit the existing ones to match my needs.
Here is the code. Any help would be much appreciated; thanks in advance, my friends.
public void SetRetainSameConnection()
{
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Connections connections = _Pkg.Connections;
if (connections != null)
{
for (int Idex = 0; Idex < connections.Count; Idex++)
{
ConnectionManager connection = connections[Idex];
if (connection.Properties.Contains("RetainSameConnection"))
connection.Properties["RetainSameConnection"].SetValue(connection, true);
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
throw ex;
}
}
public void SetPakageConnections(System.Collections.Hashtable ConnectionCollection)
{
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Connections connections = _Pkg.Connections;
if ((ConnectionCollection != null) && (connections != null))
{
for (int Idex = 0; Idex < connections.Count; Idex++)
{
ConnectionManager connection = connections[Idex];
string ConName = connection.Name;
if (ConnectionCollection.Contains(ConName))
{
connection.ConnectionString =
ConnectionCollection[ConName].ToString();
}
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
throw ex;
}
}
public void SetPakageVariables(System.Collections.Hashtable VariableCollection)
{
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Variables variables = _Pkg.Variables;
if ((VariableCollection != null) && (variables != null))
{
for (int Idex = 0; Idex < variables.Count; Idex++)
{
Variable variable = variables[Idex];
string VarName = variable.Name;
if (VariableCollection.Contains(VarName))
{
bool IsReadOnly = (bool)variable.Properties["ReadOnly"].GetValue(variable);
if (!IsReadOnly)
variable.Value =
VariableCollection[VarName].ToString();
}
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
throw ex;
}
}
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
modified 21-Oct-16 20:20pm.
|
|
|
|
|
We'd like to start using Database Projects to manage our SQL Server code in the same structured way we do the other code. We have a separate domain for development, and we also pass any changes to production to a third party.
Is there a good resource (book, blog etc) on the day-to-day usage of these?
One issue I've got at the moment is that the comparison result script is very protective (which is good). In my case it's warning that tables being altered could contain data. Now I know that it doesn't matter, as these are working tables, so it doesn't matter if they 'lose' data.
Should I be taking the script and then altering it to suit the situation, or have I missed something earlier that would have made VS generate a better script?
Also would people say that the comparison script is the best way to bring a database to the right version or is there a better way? Note the implementers have little to no understanding of the purpose of the database.
|
|
|
|
|
We use them where I work to version stored procedures, permissions and other objects in source control. We also use the generated script(s) to deploy changes. The build process includes a task to compile the db project(s) and generate a diff script for the target environment, and then all the application files, SQL scripts, etc, are pushed to a staging folder on the network, ready for deployment. I personally alter the generated scripts when I come across the warnings of existing data being lost. However, I don't know if that's standard/best practice or not - that's just how I've always done it.
djj55: Nice but may have a permission problem
Pete O'Hanlon: He has my permission to run it.
|
|
|
|
|
Thanks for that, this is what I'm thinking will be the case too.
VS can't *know* everything, so it makes sense that manual alteration of the diff scripts is required.
|
|
|
|
|
Right. It can be a bit tedious. I've been working on a project for a few months, an existing application. The new enhancements/features required that we split some existing tables out (moving a field from one table to a new table). When setting up the deployment, I had to alter the generated diff scripts quite a bit.
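For what it's worth, the manual data-motion step that has to be spliced into the generated diff script usually looks something like this (all object and column names here are hypothetical, not from the actual project):

```sql
-- Hypothetical sketch: copy the field into the new table before the
-- generated script drops it from the old one.
INSERT INTO dbo.NewTable (OldTableId, MovedColumn)
SELECT Id, MovedColumn
FROM dbo.OldTable;

ALTER TABLE dbo.OldTable DROP COLUMN MovedColumn;
```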
djj55: Nice but may have a permission problem
Pete O'Hanlon: He has my permission to run it.
|
|
|
|
|
Hi,
I have a Console Application that executes an SSIS package. Since the package is taking too long to execute, I want either to set the time-out property to its maximum or to allow more execution time.
Currently I am setting RetainSameConnection to true on the connection, but if I could set a time-out property, that would be better, whether on the execution or on the connection; anything is good.
Can anybody please help me with this? Any help, whether a code snippet, a link, or even a suggestion, would be very helpful. Thanks in advance.
Here is my code:
For loading and executing the SSIS package, I am using a class:
public class DTSPackage
{
private string _PkgLocation;
private DTSPackageStatus _Status;
private Package _Pkg;
private Microsoft.SqlServer.Dts.Runtime.Application _app;
public string PkgLocation
{
get { return _PkgLocation; }
}
public DTSPackageStatus PackageStatus
{
get { return _Status; }
}
public string SsisPackageToBeLoadedFrom { get; set; }
public DTSPackage()
{
_PkgLocation = null;
_Status = DTSPackageStatus.Empty;
_Pkg = null;
_app = new Microsoft.SqlServer.Dts.Runtime.Application();
}
private void DisposePackage()
{
try
{
if (_Pkg != null)
{
_Pkg.Dispose();
_Pkg = null;
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
}
public void ClearPackage()
{
_PkgLocation = null;
_Status = DTSPackageStatus.Empty;
DisposePackage();
}
public void LoadPackage(string SsisPackageToBeLoadedFrom = "File System", string PackageName = "", string TargetServer = "WSQL569S,50101", string PackFileSystemLocation = "", string PkgMSDBLocation = "", string SSISPackageStoreLocation = "")
{
/* PkgLocation = PkgLocation; */
_Pkg = null;
_PkgLocation = null;
DisposePackage();
try
{
string packageFullPhysicalPath = Path.Combine(PackFileSystemLocation, PackageName);
if (String.IsNullOrEmpty(Path.GetExtension(packageFullPhysicalPath)))
{
packageFullPhysicalPath = String.Concat(packageFullPhysicalPath, ".dtsx");
}
switch (SsisPackageToBeLoadedFrom)
{
case "File System":
if (File.Exists(packageFullPhysicalPath))
{
_Pkg = _app.LoadPackage(packageFullPhysicalPath, null);
_PkgLocation = packageFullPhysicalPath;
_Status = DTSPackageStatus.Loaded;
}
else
{
throw new ApplicationException("Invalid file location: " + packageFullPhysicalPath);
}
break;
case "MSDB":
_PkgLocation = Path.Combine(PkgMSDBLocation, PackageName);
if (!_app.ExistsOnSqlServer(_PkgLocation, TargetServer, String.Empty, String.Empty))
{
_Pkg = _app.LoadPackage(packageFullPhysicalPath, null);
_app.SaveToSqlServer(_Pkg, null, TargetServer, null, null);
_Status = DTSPackageStatus.Loaded;
}
if (_app.ExistsOnSqlServer(PkgLocation, TargetServer, String.Empty, String.Empty))
{
_Pkg = _app.LoadFromSqlServer(PkgLocation, TargetServer, String.Empty, String.Empty, null);
_Status = DTSPackageStatus.Loaded;
}
else
{
throw new ApplicationException("Invalid package name or location: " + PkgLocation);
}
break;
case "DTS":
_PkgLocation = Path.Combine(SSISPackageStoreLocation, PackageName);
if (!_app.ExistsOnDtsServer(@"\Stored Packages\MSDB" + _PkgLocation, TargetServer))
{
_Pkg = _app.LoadPackage(packageFullPhysicalPath, null);
_app.SaveToDtsServer(_Pkg, null, SSISPackageStoreLocation, TargetServer);
_Status = DTSPackageStatus.Loaded;
}
if (_app.ExistsOnDtsServer(_PkgLocation, TargetServer))
{
_Pkg = _app.LoadFromDtsServer(_PkgLocation, TargetServer, null);
_Status = DTSPackageStatus.Loaded;
}
else
{
throw new ApplicationException("Invalid package name or location: " + _PkgLocation);
}
break;
default:
throw new ApplicationException("Invalid sourceType argument: valid values are 'File System', 'MSDB', and 'DTS'.");
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
}
public void SetRetainSameConnection()
{
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Connections connections = _Pkg.Connections;
if (connections != null)
{
for (int Idex = 0; Idex < connections.Count; Idex++)
{
ConnectionManager connection = connections[Idex];
connection.Properties["RetainSameConnection"].SetValue(connection, true);
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
}
public void SetPakageConnections(System.Collections.Hashtable ConnectionCollection)
{
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Connections connections = _Pkg.Connections;
if ((ConnectionCollection != null) && (connections != null))
{
for (int Idex = 0; Idex < connections.Count; Idex++)
{
ConnectionManager connection = connections[Idex];
connection.Properties["RetainSameConnection"].SetValue(connection, true);
string ConName = connection.Name;
if (ConnectionCollection.Contains(ConName))
{
connection.ConnectionString =
ConnectionCollection[ConName].ToString();
}
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
}
public System.Collections.Hashtable GetPackageConnections()
{
System.Collections.Hashtable ConnectionCollection =
new System.Collections.Hashtable();
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Connections connections = _Pkg.Connections;
if (connections != null)
{
for (int Idex = 0; Idex < connections.Count; Idex++)
{
ConnectionManager connection = connections[Idex];
string ConName = connection.Name;
string ConStr = connection.ConnectionString;
ConnectionCollection.Add(ConName, ConStr);
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
return ConnectionCollection;
}
public void SetPakageVariables(System.Collections.Hashtable VariableCollection)
{
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Variables variables = _Pkg.Variables;
if ((VariableCollection != null) && (variables != null))
{
for (int Idex = 0; Idex < variables.Count; Idex++)
{
Variable variable = variables[Idex];
string VarName = variable.Name;
if (VariableCollection.Contains(VarName))
{
variable.Value =
VariableCollection[VarName].ToString();
}
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
}
public System.Collections.Hashtable GetPackageVariables()
{
System.Collections.Hashtable VariablesCollection =
new System.Collections.Hashtable();
try
{
if (_Status == DTSPackageStatus.Loaded)
{
Variables variables = _Pkg.Variables;
if (variables != null)
{
for (int Idex = 0; Idex < variables.Count; Idex++)
{
Variable variable = variables[Idex];
string VarName = variable.Name;
object VarVal = variable.Value;
VariablesCollection.Add(VarName, VarVal);
}
}
}
}
catch (Exception ex)
{
_Status = DTSPackageStatus.LoadFailed;
}
return VariablesCollection;
}
public DTSExecResult Execute()
{
return _Pkg.Execute();
}
public DTSExecResult Execute(SSISEventListener Listerner)
{
return _Pkg.Execute(null, null, Listerner, null, null);
}
~DTSPackage()
{
DisposePackage();
}
}
From the if ((connectionValuesFromConfig != null) && (connectionValuesFromConfig.Count > 0)) statement onward, I am using it in the main method:
if ((connectionValuesFromConfig != null) && (connectionValuesFromConfig.Count > 0))
{
foreach (DictionaryEntry entry in existingConnections)
{
tempString = (string)connectionValuesFromConfig[entry.Key.ToString().Replace(" ", string.Empty)];
if (!string.IsNullOrEmpty(tempString))
toBeSetConnections.Add(entry.Key, tempString);
}
_package.SetPakageConnections(toBeSetConnections);
}
_package.SetRetainSameConnection();
Hashtable existingVariables = _package.GetPackageVariables();
Hashtable toBeSetVariables = new Hashtable();
IDictionary variableValuesFromConfig = WatcherDB.CallGetDtsVariables(ImportMetaDataId);
if ((variableValuesFromConfig != null) && (variableValuesFromConfig.Count > 0))
{
foreach (DictionaryEntry entry in existingVariables)
{
string SourceFileVariableManagerName = string.Empty;
if ((variableValuesFromConfig["SourceFileVariableManagerName"] != null) && !DBNull.Value.Equals(variableValuesFromConfig["SourceFileVariableManagerName"]))
{
SourceFileVariableManagerName = (string)variableValuesFromConfig["SourceFileVariableManagerName"];
}
if (entry.Key.ToString() == SourceFileVariableManagerName)
{
toBeSetVariables.Add(entry.Key, fileFullPath);
}
else
{
tempString = (string)variableValuesFromConfig[entry.Key.ToString().Replace(" ", string.Empty)];
if (!string.IsNullOrEmpty(tempString))
toBeSetVariables.Add(entry.Key, tempString);
}
}
_package.SetPakageVariables(toBeSetVariables);
}
SSISEventListener Listerner = new SSISEventListener();
DTSExecResult PkgResult = _package.Execute(Listerner);
if (PkgResult == DTSExecResult.Success)
{
errorDescription = Listerner.ErrorDescription;
Logger.Log(dtsPackPath, "", "FileWatcherScheduleApp", "WatcherScheduler", "ExecutePackage", PkgResult.ToString(), PkgResult.ToString(), (int)Priority.Debug, PriorityTobeLogged);
WatcherDB.UpdateImportedFileDetails(Id);
return true;
}
else
{
errorDescription = Listerner.ErrorDescription;
Logger.Log(dtsPackPath, "", "FileWatcherScheduleApp", "WatcherScheduler", "ExecutePackage", "Error", errorDescription, (int)Priority.LogicalError, PriorityTobeLogged);
}
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
|
|
|
|
Hi,
I have been going through a tough time resolving this issue. I am getting the following error when trying to execute a package stored in the MSDB database: "Source : Microsoft.SqlServer.Dts.Runtime.TaskHost, ErrorCode : -1073679321, Description : To run a SSIS package outside of SQL Server Data Tools you must install Populate FileCollection of Integration Services or higher."
My code is as below:
static void Main(string[] args)
{
Application app = new Application();
Package loadedPkg = app.LoadPackage(@"\bsc\cscm\edh_imdb\DataExchange\ShieldLink\NewFolder\Brandon\TestSSISPackage\SgLgReporting_SLNewBusinessInProgress.dtsx", null);
app.SaveToSqlServer(loadedPkg, null, "WSQL569S,50101", null, null);
var temp = LaunchPackage("sql", string.Empty, "SgLgReporting_SLNewBusinessInProgress.dtsx");
}
public static int LaunchPackage(string sourceType, string sourceLocation, string packageName)
{
string packagePath;
Package myPackage;
Application integrationServices = new Application();
packagePath = Path.Combine(sourceLocation, packageName);
switch (sourceType)
{
case "file":
if (String.IsNullOrEmpty(Path.GetExtension(packagePath)))
{
packagePath = String.Concat(packagePath, ".dtsx");
}
if (File.Exists(packagePath))
{
myPackage = integrationServices.LoadPackage(packagePath, null);
}
else
{
throw new ApplicationException("Invalid file location: " + packagePath);
}
break;
case "sql":
if (integrationServices.ExistsOnSqlServer("SgLgReporting_SLNewBusinessInProgress", "WSQL569S,50101", String.Empty, String.Empty))
{
myPackage = integrationServices.LoadFromSqlServer("SgLgReporting_SLNewBusinessInProgress", "WSQL569S,50101", String.Empty, String.Empty, null);
}
else
{
throw new ApplicationException("Invalid package name or location: " + packagePath);
}
break;
case "dts":
if (integrationServices.ExistsOnDtsServer(packagePath, "."))
{
myPackage = integrationServices.LoadFromDtsServer(packagePath, "localhost", null);
}
else
{
throw new ApplicationException("Invalid package name or location: " + packagePath);
}
break;
default:
throw new ApplicationException("Invalid sourceType argument: valid values are 'file', 'sql', and 'dts'.");
}
return (Int32)myPackage.Execute();
}
The things I have tried are:
- Checked SQL Server Management Studio for the SQL Server version; it is 12.0, and I am also running SQL Server Data Tools version 12.0.
- Checked the SQL Server Integration Services service running on the server; yes, version 11.0 is running on the server.
- Checked various tools on the server; they are as below:
Microsoft SQL Server Management Studio: 11.0.6020.0
Microsoft Analysis Services Client Tools: 11.0.6020.0
Microsoft Data Access Components (MDAC): 6.3.9600.17415
Microsoft MSXML: 3.0 6.0
Microsoft Internet Explorer: 9.11.9600.18450
Microsoft .NET Framework: 4.0.30319.34014
Operating System: 6.3.9600
- I have the SQL Server client components installed on my machine, but the server components are not, so I don't have the SSIS tools on the local machine from which I am running the console app to execute the SSIS package. Could that be the problem? Can I just install the SSIS tools?
Here is my machine Configuration:
Microsoft SQL Server Management Studio: 11.0.3000.0
Microsoft Analysis Services Client Tools: 11.0.3000.0
Microsoft Data Access Components (MDAC): 6.1.7601.17514
Microsoft MSXML: 3.0 5.0 6.0
Microsoft Internet Explorer: 9.11.9600.18449
Microsoft .NET Framework: 4.0.30319.42000
Operating System: 6.1.7601
Is there anything that I am missing?
I have been searching and googling all day to resolve this issue. Any help is much appreciated; thanks in advance, my friends.
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
|
|
|
|
It turned out I needed the SSIS runtime installed on the machine where I am executing the console app, or to reinstall SQL Server along with Integration Services.
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
|
|
|
|
I have an MS Access database that needs to be exported to SQL Server on a daily basis. The Access database contains numerous tables and the views associated with them. It gets updated every day, and it is fine to replace the database daily. Sometimes the structure remains the same, but many of the tables will hold more data each day; modifications to the existing structure are also possible at times.
So, rather than appending data to the existing tables, I would like to replace the whole database in SQL Server on a daily basis and perform operations on it, and this should be scheduled at a particular time. I am looking for the best option to do this. Would SSIS packages be helpful? Please suggest possible ways.
|
|
|
|
|
meeram39 wrote: Please suggest possible ways. Create a linked server in SQL Server, point it at your Access database, and use SELECT INTO to copy the tables you need.
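A minimal sketch of that approach, assuming the ACE OLE DB provider is installed; the linked server name, file path, and table name below are placeholders:

```sql
-- Hypothetical sketch: register the Access file as a linked server,
-- then copy a table into SQL Server with SELECT ... INTO.
EXEC sp_addlinkedserver
    @server = N'AccessDb',
    @srvproduct = N'Access',
    @provider = N'Microsoft.ACE.OLEDB.12.0',
    @datasrc = N'C:\Data\Daily.accdb';

-- Replace yesterday's copy, then load today's data.
IF OBJECT_ID(N'dbo.Customers_Staging') IS NOT NULL
    DROP TABLE dbo.Customers_Staging;

SELECT *
INTO dbo.Customers_Staging
FROM AccessDb...Customers;
```

A SQL Server Agent job can then run this on the daily schedule.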
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Hi,
I am getting the following error when inserting rows into a table with an identity column in SQL.
This is the code:
Create Table Myschema.Employee_Demo
(
Emp_ID int constraint pk_con Primary Key Identity(1,1),
Emp_Name varchar(55),
Emp_Sal decimal(10,2)
)
Insert Myschema.Employee_Demo(Emp_Name,Emp_Sal)values('Puja',20,000),('Tira',10,000),('Serrma',15,000),('Airrwari',12,000),('Shyyjina',13,000)
ERROR:
Msg 110, Level 15, State 1, Line 1
There are fewer columns in the INSERT statement than values specified in the VALUES clause. The number of values in the VALUES clause must match the number of columns specified in the INSERT statement.
How do I solve this error? Please help.
Thanks...
|
|
|
|
|
Take the delimiter out of the monetary values; these are for display only, and are not stored in the database. The SQL parser sees the comma in the figures as a column delimiter:
Insert Myschema.Employee_Demo(Emp_Name,Emp_Sal)values('Puja',20000),('Tira',10000),('Serrma',15000),('Airrwari',12000),('Shyyjina',13000)
=========================================================
I'm an optoholic - my glass is always half full of vodka.
=========================================================
|
|
|
|
|
When you are inserting a record, the value for Emp_Sal should not contain a comma (,).
That is, ('Puja',20,000) should be written as ('Puja',20000).
|
|
|
|
|
Hello all. I am writing a C# program. The program downloads an RDL report and then examines the SharedDataSetReference tag of the XML.
From this information, I can get the server path to the shared dataset. What I need to do is use the shared dataset name to get the name of the stored procedure that the dataset gets its information from. I assume that I would query the Report Server database. The dataset uses a shared data source as well. I have been trying to figure out how to do this for a while now, and I cannot.
I am not giving up though. If I figure it out, I will come back and post the solution, but I would appreciate any help I can get.
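To make the idea concrete, the direction I have been exploring is a query against the ReportServer catalog, roughly like the following (the Catalog table and its Type codes are undocumented and version-dependent; Type = 8 marks shared datasets in SSRS 2008 R2 and later, and the dataset name is a placeholder):

```sql
-- Exploratory query against the ReportServer database: the Content
-- column holds the dataset definition XML; the CommandText element
-- inside that XML names the stored procedure (or query) it runs.
SELECT
    c.Name,
    CONVERT(XML, CONVERT(VARBINARY(MAX), c.Content)) AS DatasetXml
FROM dbo.[Catalog] AS c
WHERE c.[Type] = 8
  AND c.Name = N'MySharedDataset';  -- placeholder dataset name
```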
Thanks,
David
|
|
|
|
|
From reading this[^], the shared data source may have nothing to do with a database; it is a file on the REPORT server, and its source may be anything: Excel, CSV, or a web service query.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|