Hi,
I have an ASP.NET application that uses a connection string to connect to the database, but I am getting the error below:
Connection open and login was successful, but then an error occurred while enabling MARS for this connection. (provider: Named Pipes Provider, error: 15 - Function not supported)
I am using ASP.NET Framework 4.5 and SQL Server 2012.
DataTable dt = new DataTable();
using (var objcon = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["test"].ConnectionString))
using (var command = new SqlCommand("SELECT * FROM testData", objcon))
{
    try
    {
        objcon.Open();
        command.CommandTimeout = 3000;
        using (var res = command.ExecuteReader())
        {
            dt.Load(res);
        }
        lblID.Text = "Success";
    }
    catch (Exception ex)
    {
        lblID.Text = ex.Message;
    }
}
|
I'd guess you're enabling MARS in your connection string, and the server somehow refuses to enable it. If you're not using an Express edition, I'd recommend checking whether the feature is installed and turned on.
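For reference, MARS is controlled from the connection string itself; a minimal sketch of a web.config entry with it explicitly turned off (the server and database names here are placeholders):

```xml
<connectionStrings>
  <add name="test"
       connectionString="Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True;MultipleActiveResultSets=False"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

Setting MultipleActiveResultSets=True is what asks the server to enable MARS in the first place; if it's in your connection string and you don't actually need multiple open result sets per connection, removing it should make the error go away.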
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
Hi All,
I am trying to access package-level variables inside my Script Component's ProcessInputRow event in my SSIS package, but it's giving me the following error:
An exception of type 'Microsoft.SqlServer.Dts.Pipeline.ReadWriteVariablesNotAvailableException' occurred in Microsoft.SqlServer.TxScript.dll but was not handled in user code
Additional information: The collection of variables locked for read and write access is not available outside of PostExecute.
And my code is as below:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    IDTSVariables100 vars = null;
    VariableDispenser.LockForRead("System::TaskName");
    VariableDispenser.GetVariables(out vars);
    string TaskName = vars["System::TaskName"].Value.ToString();
    vars.Unlock();

    var watcherDBConnectionString = "Data Source=" + Variables.TargetServer01.ToString()
        + ";Initial Catalog=" + Variables.WatcherDB.ToString() + ";" + Variables.Security.ToString();
    var ErrorLogStoredProcedure = ReadOnlyVariables["ErrorLogStoredProcedure"].Value.ToString();
    int PriorityToBeLogged = (int)Priority.Debug;

    Log(Variables.PackageName.ToString(), "", "", "ScriptMain", "Main",
        "Row.FileName : " + Row.FileName + ", Row.File.Length : " + Row.File.Length.ToString(),
        "ABDUL", Priority.LogicalError, PriorityToBeLogged,
        watcherDBConnectionString, ErrorLogStoredProcedure);
}
Can anybody please help me with this? Any code snippet, a link, or even a suggestion would help. Can't I access or assign variables in the ProcessInputRow event of a Script Component?
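For reference, a common workaround, sketched below and not tested against this particular package, is to read the variable once in PreExecute and cache it in a field, since the variable dispenser is what throws outside of PostExecute:

```csharp
// Cached once per component execution; "TaskName" assumes System::TaskName
// has been added to the component's ReadOnlyVariables list in the editor.
private string taskName;

public override void PreExecute()
{
    base.PreExecute();
    taskName = Variables.TaskName;   // safe to read here
}

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Use the cached value per row instead of VariableDispenser.
    string currentTask = taskName;
}
```

The idea is simply to move all variable access out of the per-row event into PreExecute (reads) and PostExecute (writes).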
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
So I'm back to that FoxPro database again in a VB app, and I thought I'd try to get even more efficient at loading data.
I'm trying to join a billing and an address DBF file into a union. I think it should work, but I get an error saying that I'm missing an operator; I suspect I have the brackets in the wrong format.
The single join works, using the left join and right join on the union; it's when I added the 2nd join that it failed.
There's not much on this out there when searching.
Possible to take a look and perhaps reflect back 18 years on this?
Dim connString As String = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" & m_path & "; Extended Properties=dBASE IV"
Const queryString As String = _
"SELECT " & _
" h.FINVNO " & _
", h.FSONO " & _
", h.FCUSTNO " & _
", h.FCOMPANY " & _
", h.FSALESPN " & _
", h.FSHIPDATE " & _
", h.FPONO " & _
", h.FCSAMT " & _
", h.FTAXAMT1 " & _
", h.FBADDRCD " & _
", h.FSADDRCD " & _
", hbA.FCOMPANY " & _
", hbA.FADDR1 " & _
", hbA.FADDR2 " & _
", hbA.FCITY " & _
", hbA.FSTATE " & _
", hbA.FZIP " & _
" FROM ARINV01H.dbf h " & _
" LEFT JOIN ARADD01H.dbf hbA ON (h.FINVNO = hbA.FINVNO) " & _
" LEFT JOIN ARCUS01.dbf hsA ON (h.FCUSTNO = hsA.FCUSTNO) " & _
" WHERE " & _
" h.FSALESPN = @FSALESPN " & _
" AND " & _
" h.FSHIPDATE >= @startDate AND h.FSHIPDATE <= @stopDate " & _
"UNION ALL " & _
"SELECT " & _
" v.FINVNO " & _
", v.FSONO " & _
", v.FCUSTNO " & _
", v.FCOMPANY " & _
", v.FSALESPN " & _
", v.FSHIPDATE " & _
", v.FPONO " & _
", v.FCSAMT " & _
", v.FTAXAMT1 " & _
", v.FBADDRCD " & _
", v.FSADDRCD " & _
", vbA.FCOMPANY " & _
", vbA.FADDR1 " & _
", vbA.FADDR2 " & _
", vbA.FCITY " & _
", vbA.FSTATE " & _
", vbA.FZIP " & _
" FROM ARINV01.dbf v " & _
" RIGHT JOIN ARADD01.dbf vbA ON (v.FINVNO = vbA.FINVNO) " & _
" RIGHT JOIN ARCUS01.dbf vsA ON (v.FCUSTNO = vsA.FCUSTNO) " & _
" WHERE " & _
" v.FSALESPN = @FSALESPN " & _
" AND " & _
" v.FSHIPDATE >= @startDate AND v.FSHIPDATE <= @stopDate "
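For reference, the Jet OLE DB dialect wants each additional join wrapped in parentheses, so the missing-operator error may just be the unparenthesized second join; a sketch of the first FROM clause rewritten that way (untested against these files):

```sql
FROM (ARINV01H.dbf h
      LEFT JOIN ARADD01H.dbf hbA ON h.FINVNO = hbA.FINVNO)
     LEFT JOIN ARCUS01.dbf hsA ON h.FCUSTNO = hsA.FCUSTNO
```

The second half of the UNION would need the same treatment for its two RIGHT JOINs.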
21st Century Globalism has become Socialism on a planetary scale, in which the unequal treaties of the past have come back into play.
|
jkirkerx wrote: The single join works, using the left join and right join on the union, it's when I added the 2nd join it failed. What does that mean? Did it throw an error? If yes, which one? If not, what was the difference between the expected and the actual result?
jkirkerx wrote: I'm back to that Foxpro database again in a VB app There's the problem
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
Sorry for the late reply.
I'll figure it out one day. I hate working on this, but it's good money.
21st Century Globalism has become Socialism on a planetary scale, in which the unequal treaties of the past have come back into play.
|
Hi,
I have an SSIS package, and my lead wants me to create a C# application that sets the variables, connections and everything properly and executes the package. I did that; I even wrote an OnError event handler and logged the reason the SSIS package failed to execute. As far as I can see, there is no difference between my application and a SQL job. In fact, my application logs every detail, and the logging is priority-based, depending on the environment. But just asking: is it a good idea to write my own application, or did I do wrong?
My lead wanted it this way because there are many SSIS packages, and the SQL jobs sometimes run at the same time and create a lot of mess. My application runs them synchronously, and combines that synchronous process with an asynchronous FileSystemWatcher process, which records in the database when a file is dropped; my app then takes those details (path, etc.), sets the connections and variables, calls the SSIS package synchronously, and kills the execution if it takes too long. All these features are implemented in C# code; if anybody has better advice, please let me know your opinions.
Just want to weigh the pros and cons with some other developers to check my process; in fact, I believe all programmers are always learners.
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
Using C# to kick off SSIS jobs is just fine. I've done it too and it works great. It sounds like you built a good system.
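For anyone wanting to try the same, the managed SSIS runtime makes the core of it fairly compact; a minimal sketch, where the package path and variable name are placeholders:

```csharp
using Microsoft.SqlServer.Dts.Runtime;

// Load a package from disk, set a variable, run it, and dump any errors.
var app = new Application();
Package pkg = app.LoadPackage(@"C:\packages\MyPackage.dtsx", null);
pkg.Variables["User::SourcePath"].Value = @"C:\drop\incoming.csv";

DTSExecResult result = pkg.Execute();
if (result == DTSExecResult.Failure)
{
    foreach (DtsError err in pkg.Errors)
    {
        Console.WriteLine(err.Description);
    }
}
```

Event-based logging like the OnError handler mentioned above is done by passing an IDTSEvents implementation to Execute instead of null.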
There are two kinds of people in the world: those who can extrapolate from incomplete data.
There are only 10 types of people in the world, those who understand binary and those who don't.
|
Thanks my friend
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
Yeah, I've worked on systems where we built C# (and, dare I say it, VB6) apps to monitor, trigger, and kill SSIS packages. To be fair, most of them began their existence before MS introduced all of the help that is now available with SQL jobs... but the extra level of *specific* logging far outweighed the "out of the box" solution. When you are in a situation where there are many packages, jobs, etc., sometimes that extra level of customisation can pay for itself.
|
Thanks my friend. Yeah, the logging was needed; I just want to know what's going on inside the execution. It's priority-based, though: in Dev it logs at the lowest priority, but in Prod it writes only exceptions and failures.
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
I have this flattened database design, and at checkout I need to get the total weight of the cart items.
So I made a method call to the shopping cart, and I have the items in a List.
But I need to get the product dimensions from another table that I did not join, because I used the database table to store the cart data and I don't have a place to store the dimensions.
I'm not sure if I should rewrite this again, or if there is a quick way to get the dimensions.
I really don't want to make another model to do this.
if (context.SHOPPING_CART.Any(m => m.customer_accountName == aN))
    sCs = context.SHOPPING_CART.Where(m => m.customer_ID == cID).ToList();

if (sCs != null)
{
    orderTotal = sCs.Sum(m => m.item_Price_Value * m.item_Qty);
    orderCost = sCs.Sum(m => m.item_Cost_Value * m.item_Qty);
    orderProfit = orderTotal - orderCost;
    orderWeight = "was hoping for quick method here"
}
PRODUCT_DIMENSIONS
Weight_Gravity is the field that I want to join on so I can sum it.
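Not knowing the rest of the model, one possible shape for that missing line, assuming item_ID lines up with PRODUCT_DIMENSIONS.ProductID and both sets are on the same context (hypothetical on my part), would be a join-and-sum:

```csharp
// Hypothetical: join cart lines to their dimensions and total weight * qty.
decimal orderWeight = (from c in context.SHOPPING_CART
                       where c.customer_ID == cID
                       join d in context.PRODUCT_DIMENSIONS
                           on c.item_ID equals d.ProductID
                       select c.item_Qty * d.Weight_Gravity).Sum();
```

That keeps it in one query without adding a new model class.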
Globalism is Socialism on a planetary scale.
|
What's the structure of the database / EF model?
And are you really only going to allow each customer to have a single cart? Or are you going to delete the cart once it's processed?
jkirkerx wrote:
if (context.SHOPPING_CART.Any(m => m.customer_accountName == aN))
sCs = context.SHOPPING_CART.Where(m => m.customer_ID == cID).ToList();
Not sure what purpose that "if" serves? The condition doesn't match the following Where clause; and even if it did, it would only slow your code down.
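In other words, the extra round-trip for the Any can simply go; a sketch:

```csharp
// One query; an empty list just makes the aggregates zero.
var sCs = context.SHOPPING_CART.Where(m => m.customer_ID == cID).ToList();
if (sCs.Count > 0)
{
    orderTotal = sCs.Sum(m => m.item_Price_Value * m.item_Qty);
}
```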
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
On the design of the cart, I decided to use a field called "Account Name". At first, when an account does not exist, I store the session number in the "Account Name" field. Then I try to lure the shopper into creating an account, or they create one at checkout. Once the account is created, I write a cookie with their info and swap the session number for the account name. Now the stage is set: each time the cart page is loaded, I can load their items automatically, from any device they use, such as phone or desktop.
Upon PlaceOrder, with the LINQ I'm asking about, the cart contents will be copied to another table and the shopping cart will be deleted.
This is the program in its current state,
[Project Indigo] - test drive store.
There's a table called "Product_Info" that contains the master product information, to which "Product_Dimensions" can be joined.
[Table("dbo.SHOPPING_CART")]
public class SHOPPING_CART
{
    [Key()]
    [Column(Order = 1)]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int cart_ID { get; set; }
    [Required]
    public int customer_ID { get; set; }
    [Required]
    [MaxLength(80)]
    public string customer_accountName { get; set; }
    [Required]
    public DateTime timeStamp { get; set; }
    [Required]
    [MaxLength(80)]
    public string item_Type { get; set; }
    [Required]
    public int item_ID { get; set; }
    [Required]
    public int item_Qty { get; set; }
    public bool item_Taxable { get; set; }
    [MaxLength(80)]
    public string item_SKU { get; set; }
    [MaxLength(160)]
    public string item_Description_Title { get; set; }
    [Required]
    public decimal item_Cost_Value { get; set; }
    public decimal item_RollCharge_Value { get; set; }
    [Required]
    public decimal item_Price_Value { get; set; }
    [Required]
    public decimal item_Total_Value { get; set; }
    public int item_Avatar_ID { get; set; }
    public byte[] item_Avatar_Data { get; set; }
    [MaxLength(210)]
    public string item_Avatar_Url { get; set; }
    [MaxLength(10)]
    public string type_Currency { get; set; }
    [MaxLength(10)]
    public string type_Weight { get; set; }
    [MaxLength(80)]
    public string type_Unit { get; set; }
    public int vendor_ID { get; set; }
    [MaxLength(160)]
    public string vendor_Name { get; set; }
    public int brand_ID { get; set; }
    [MaxLength(160)]
    public string brand_Name { get; set; }
    public int department_ID { get; set; }
    [MaxLength(160)]
    public string department_Name { get; set; }
}
[Table("dbo.PRODUCT_DIMENSIONS")]
public class PRODUCT_DIMENSIONS
{
    [Key()]
    [Column(Order = 1)]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int DimensionID { get; set; }
    public int ProductID { get; set; }
    public decimal Weight_Gravity { get; set; }
    [MaxLength(6)]
    public string Weight_Type { get; set; }
    public decimal Size_Length { get; set; }
    public decimal Size_Width { get; set; }
    public decimal Size_Height { get; set; }
    [MaxLength(4)]
    public string Size_Type { get; set; }
}
Globalism is Socialism on a planetary scale.
|
I think I'm going to throw that code in the trash and start again with a model to populate instead; split the function into load, calculate, and write steps.
Globalism is Socialism on a planetary scale.
|
I'd be inclined to normalize that table a lot more. You'd probably want something more like:
- Products - SKU, type, taxable, prices, weights, brand, department, vendor, etc.
- Customers - ID, account name, etc.
- Carts - ID, customer ID, etc.
- Items - ID, cart ID, product ID, quantity, price
Depending on your design, it might make sense to duplicate the product data on the lines for a processed cart. But "in-progress" carts should be referring back to the master product list.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
I went too far on duplicating fields in the shopping cart.
Now that I have the Automatic Migration working, I think I'll trim that down.
Globalism is Socialism on a planetary scale.
|
I shortened it down to this, and will just do a join to the master product table.
Would you include the price or leave that out as well?
public class SHOPPING_CART
{
    [Key()]
    [Column(Order = 1)]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int cart_ID { get; set; }
    [Required]
    public int customer_ID { get; set; }
    [Required]
    [MaxLength(80)]
    public string customer_accountName { get; set; }
    [Required]
    public DateTime timeStamp { get; set; }
    [Required]
    [MaxLength(80)]
    public string item_Type { get; set; }
    [Required]
    public int item_ID { get; set; }
    [Required]
    public int item_Qty { get; set; }
}
Globalism is Socialism on a planetary scale.
|
Since you don't seem to have quantity breaks or any other complicated calculations, and you're removing the cart once it's processed, I'd be inclined to leave the price on the product.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
Just off the topic, but I am asking this here because I want to ask you directly: your caption says "Globalism is Socialism on a planetary scale". How so? Do you mean socialism is nothing but imposing unwanted rules on the masses? Is that what you mean?
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
|
I changed it to "21st century globalism is socialism on a planetary scale"; it makes more sense now.
I seriously doubt anyone cares why I think this, and I would private-message you why, but I think they removed the private message feature; I can't find yours. And I don't want to violate the rules of the forum by posting my thoughts on this. But I do have a well-detailed answer for it.
21st Century Globalism has become Socialism on a planetary scale, in which the unequal treaties of the past have come back into play.
|
It doesn't matter to me, my friend; it's just your opinion, as I have mine. I respect your opinion, but I just asked what your logic was. We have both instincts in our genes, to protect our territory and to cross territory; it came from the animal ancestors we had millions upon millions of years back, as per Darwin.
Thanks,
Abdul Aleem
"There is already enough hatred in the world lets spread love, compassion and affection."
modified 6-Dec-16 20:17pm.
|
Sign Up | LinkedIn[^]
My contact info is here, just drop me an email and then I will delete this post
21st Century Globalism has become Socialism on a planetary scale, in which the unequal treaties of the past have come back into play.
|
I am writing an application that is going to run on small NAS servers, using Java.
For that purpose, I am looking for a NoSQL database that can be embedded in my application.
Another requirement is that it should be possible to connect to the database from remote clients.
That is, piece of cake. Besides those two requirements, I am prepared to adapt my application to the database.
I know of MongoDB, Neo4j and OrientDB, for example, but have no experience running them on small CPUs.
Any tips, tricks or recommendations?
//lg
/happy amateur
|
Create your own engine for minimal XML-based data sources. Your programs will read and write data to these XML data sources, and you can relax while it works on low-power CPUs too.
I personally use JSON for my regular day-to-day tasks instead of XML, but you can pick either, then write an engine for it: CRUD functions and a bit of validation, etc.
Create, read, update and delete - Wikipedia[^]
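As an illustration of how small such an engine can be, here's a sketch of a file-backed key/value store with the four CRUD operations, using only the JDK (a flat .properties file stands in for the XML or JSON format you'd actually pick):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Properties;

// Minimal file-backed key/value "engine" (illustration only).
public class TinyStore {
    private final File file;
    private final Properties data = new Properties();

    public TinyStore(File file) throws IOException {
        this.file = file;
        if (file.exists()) {
            try (InputStream in = new FileInputStream(file)) {
                data.load(in);   // hydrate existing records
            }
        }
    }

    // Create or update a record.
    public void put(String key, String value) throws IOException {
        data.setProperty(key, value);
        save();
    }

    // Read a record; returns null when absent.
    public String get(String key) {
        return data.getProperty(key);
    }

    // Delete a record.
    public void delete(String key) throws IOException {
        data.remove(key);
        save();
    }

    // Persist the whole map back to disk after each mutation.
    private void save() throws IOException {
        try (OutputStream out = new FileOutputStream(file)) {
            data.store(out, "tiny store");
        }
    }
}
```

Usage would be `new TinyStore(new File("store.properties"))` followed by put/get/delete; remote access could then be a thin socket or HTTP layer on top, and swapping Properties for a JSON or XML serializer changes only load/save.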
The sh*t I complain about
It's like there ain't a cloud in the sky and it's raining out - Eminem
~! Firewall !~
|