|
With my breakpoint set, I step through like this:
1. Click the "Edit info" button.
2. It says I must choose a leidraad to edit.
3. Choose number 4 from the listbox.
This lands on this code:
Private Sub ListBox1_SelectedIndexChanged(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles ListBox1.SelectedIndexChanged
    Dim SqlStr As String
    ' Build the query from the selected leidraad and load that row into the textboxes
    SqlStr = "Select * from tblokkies where leidraad = """ & ListBox1.Items(ListBox1.SelectedIndex) & """"
    FillTextBox(SqlStr)
End Sub
Stepping with F8, the watch window shows:
SqlStr = "Select * from tblokkies where leidraad = """ & ListBox1.Items(ListBox1.SelectedIndex) & """"
SqlStr now holds: Select * from tblokkies where leidraad = "'n uitroep"
ListBox1.SelectedIndex is 3
FillTextBox(SqlStr) shows Nothing
It then fills my textboxes with the data from the matching row in my table:
4. leidraad = 'n uitroep, A1 = gits, A2 = aag, A3 = SA (my textboxes are Luidraad, A1, A2, A3 and A4).
I now want to put new info into textbox A4: "test".
5. Click the Save button, which runs this code (stepped through with F8):
Private Sub btnAdd_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnAdd.Click
    If SaveOrEdit = "Save" Then
        Dim add As DialogResult
        add = Cls_MessageBbox.Show("Is jy Seker jy will Leidraad" & vbCrLf & vbCrLf & TxtLuidraad.Text & vbCrLf & vbCrLf & "Byvoeg", "Byvoeg.", MessageBoxButtons.YesNo, MessageBoxIcon.Question)
        If add = Windows.Forms.DialogResult.Yes Then
            AddNew()
        Else
            ListBox1.Text = "Geen leidraad is Bygevoeg nie"
        End If
    ElseIf SaveOrEdit = "Edit" Then
        Edit()
    Else
        StatusLabel.Text = " Geen Leidraad is bygevoeg"
    End If
End Sub
If SaveOrEdit = "Save" Then =====Saveoredit ""
Then jumbs to
ElseIf SaveOrEdit = "Edit" Then =======Saveordeit ""
StatusLabel.Text = " Geen Leidraad is bygevoeg"=========Statuslabel.text "Leidradd :'n Uiroep: geselkteer"
After F8
StatusLabel.Text = " Geen Leidraad is bygevoeg"=========Statuslabel.text "Geen Leidraad is bygevoeg"
This is where it stops.
|
|
|
|
|
I understood that the btnAdd_Click handler is the Save button. Somehow it does trigger this for the first three textboxes (as you said they were saved correctly), but number four is left out.
The variable SaveOrEdit hasn't got the value "Edit", which would trigger the Edit() function. The question remains why it works differently for the first three textboxes than for the fourth.
It 'feels' like there's something wrong with the program flow, but I can't point to exactly where the pain is.
|
|
|
|
|
How is it possible to connect a VC++.NET application to a web service?
I want to post the values of some fields to a web service (written in C#.NET) when the Submit button is clicked.
What type of project should I choose, MFC or something else? Please guide me.
|
|
|
|
|
I'm wondering which is better:
1. opening the connection to the database and leaving it open until the application is closed, or
2. closing the connection and opening it again whenever I need to connect,
which means I will open and close the connection many times while the application is running.
Thanks in advance.
|
|
|
|
|
Just a technical point: assuming that the connection is made by the same program to the same database, the connection doesn't actually get closed and opened lots of times. Most database drivers use connection pooling, which caches the database connection. This increases the speed considerably.
Personally, I would go with the second option. I don't write database programs often, but with the advent of connection pooling the speed reduction is virtually eliminated. On the other hand, a program may terminate abnormally; an example of this would be the 'End Process' button in Task Manager. If that happens, I don't know whether the connection would get terminated.
This choice is just my opinion really. It may not be right, but it goes with my general method of retrieving data: get in, read it as quickly and efficiently as possible, get out, deal with the extracted data.
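To make option 2 concrete, here is a minimal C# sketch of the open-late, close-early pattern, assuming SQL Server via System.Data.SqlClient; the connection string, table and query are placeholders, and the default connection pooling is what keeps the repeated opens cheap.

using System;
using System.Data.SqlClient;

class CustomerRepository
{
    // Placeholder connection string; pooling is enabled by default in SqlClient.
    private const string ConnStr = "Server=.;Database=Shop;Integrated Security=true";

    public int CountCustomers()
    {
        // Open late, close early: Dispose() at the end of the using block
        // returns the connection to the pool rather than destroying it,
        // so calling this method repeatedly stays cheap.
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Customers", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}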
|
|
|
|
|
Thanks for your reply, but I didn't get the meaning of connection pooling. What is the advantage?
I thought about it: if the program depends on a database that is on a different PC than the application, it may be better to use option 2 because of network traffic and security issues.
On the other hand, if the database is on the application PC itself, I would use option 1.
That is my opinion, and I'm not sure it is the better one.
|
|
|
|
|
Connection pooling basically caches the connection. It removes most of the overhead of creating a connection, and most database drivers use it. The main advantage is speed: there's no need to create a connection and read the data when the database driver already has some of it cached.
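For illustration, a small C# sketch of how pooling shows up with System.Data.SqlClient; the server and database names are made up, and the pool-related keywords (Pooling, Min/Max Pool Size) are the standard SqlClient connection string settings.

using System.Data.SqlClient;

class PoolingDemo
{
    static void Main()
    {
        // Pool behaviour is controlled through connection string keywords.
        var connStr = "Server=.;Database=Shop;Integrated Security=true;" +
                      "Pooling=true;Min Pool Size=1;Max Pool Size=50";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();    // first Open pays for a real connection to the server
        }                   // Close/Dispose returns it to the pool, it is not torn down

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();    // same connection string, so the pooled connection is reused
        }

        // The pool can be flushed explicitly if ever needed.
        SqlConnection.ClearAllPools();
    }
}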
|
|
|
|
|
Oh yes, I got it.
It reminds me of deferred execution in LINQ in .NET 3.5.
Deferred execution doesn't have connection pooling.
Thanks a lot.
|
|
|
|
|
Member 4697348 wrote: it [connection pooling] reminds me of deferred execution in LINQ in .NET 3.5
What has deferred execution got to do with connection pooling?
|
|
|
|
|
Deferred execution doesn't keep the data retrieved from the database in RAM,
so every time you need something you get the data from the database, not from memory.
That is good when your database is being updated a lot,
which is an advantage of LINQ.
|
|
|
|
|
That was not my question. My question was "What has deferred execution got to do with connection pooling?"
While deferred execution has the effect you describe for LINQ to SQL, deferred execution is not about whether the data is kept in RAM or not.
Deferred execution is about running the LINQ query at the point you want the data from it, not at the point you define the query. If you run your LINQ query on objects that exist only in RAM then it will filter based on the most recent state of the objects rather than the state they were in when you defined the query.
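A small LINQ to Objects sketch of that point; the list and query are invented purely to show when execution actually happens.

using System;
using System.Collections.Generic;
using System.Linq;

class DeferredExecutionDemo
{
    static void Main()
    {
        var numbers = new List<int> { 1, 2, 3 };

        // Defining the query does not run it.
        var evens = numbers.Where(n => n % 2 == 0);

        // Mutate the source after the query has been defined.
        numbers.Add(4);

        // The query runs here, against the current state of the list,
        // so it yields 2 and 4, not just the 2 that existed at definition time.
        foreach (var n in evens)
        {
            Console.WriteLine(n);
        }
    }
}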
|
|
|
|
|
Yes,
but that is for LINQ to Objects.
In LINQ to SQL the queries deal directly with the database,
without looking for the data in RAM first.
Am I right?
|
|
|
|
|
EmZan wrote: but that is for LINQ to Objects
It is for all types of LINQ. I used LINQ to Objects as an example of how it works when all the data is in RAM. The principle is the same regardless of where the data is. That was my point. I was expanding your definition as it was too narrow and dealt only with LINQ to SQL when your previous post simply mentions LINQ (without detailing what it was LINQing to).
You have still avoided my question on connection pooling. What does deferred execution in LINQ have to do with connection pooling?
|
|
|
|
|
As I understand from the above discussion:
Connection pooling:
keeps track of the last queried data in a cache, so the database engine doesn't have to execute a previously executed query again;
and that is done using the DataAdapter object (which fills the DataSet object),
which is what I took from your previous reply.
By the way, the DataAdapter automatically closes the connection after execution (reading or writing).
Deferred execution:
since LINQ queries the data and stores it in a variable, there is no need for a DataAdapter (the pooling hand);
and I'm sure that variable will be released after reading the data, and the connection is closed.
So there is no relation between connection pooling and deferred execution, as I understand it.
|
|
|
|
|
EmZan wrote: connection pooling: keeps track of the last queried data in a cache, so the database engine doesn't have to execute a previously executed query again
Nope. It keeps track of the previously used CONNECTION only, so it doesn't have to re-establish a connection to the database. This makes it faster to establish a new connection. Any queries are done against the database, even if it is the same query as the previous time.
EmZan wrote: and that is done using the DataAdapter object (which fills the DataSet object), which is what I took from your previous reply
I've never mentioned the DataAdapter. It is an evil thing.
EmZan wrote: the DataAdapter automatically closes the connection after execution (reading or writing)
Well, it says it closes the connection, but just like everything else it simply returns it to the pool.
Connection pooling happens on the SqlConnection object, which everything that connects to the database must use, LINQ or not.
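One way to see that for yourself (the connection string is a placeholder): time two consecutive opens of the same connection string, and the second one comes almost entirely from the pool while every query still goes to the database.

using System;
using System.Data.SqlClient;
using System.Diagnostics;

class PoolTiming
{
    const string ConnStr = "Server=.;Database=Shop;Integrated Security=true";

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();    // pays for the physical connection and login handshake
        }
        Console.WriteLine("first open:  " + sw.ElapsedMilliseconds + " ms");

        sw.Reset();
        sw.Start();
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();    // handed back from the pool, typically near-instant
        }
        Console.WriteLine("second open: " + sw.ElapsedMilliseconds + " ms");
    }
}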
|
|
|
|
|
Thanks for the clarification; it's really good info, but I have a few questions, please.
1. Why did you call the DataAdapter evil?
I always use it in my applications (actually my 4 applications); I'm a junior, by the way.
2. You said that any query is done against the database!
What I know is that a DataAdapter keeps the queried data, and I use it many times without going to the database again and again for the same query.
If that's wrong, I'd say that I read before in a LINQ tutorial that reading the most recent data is an advantage,
which means there is a technology that reads old data many times without going to the database, as I mentioned.
|
|
|
|
|
EmZan wrote: 1. Why did you call the DataAdapter evil? I always use it in my applications (actually my 4 applications); I'm a junior, by the way.
Data adapters are used to copy data into a DataSet or DataTable. These are large, clunky constructs that are almost always under-utilised, so they end up taking more memory than they actually need for the job you use them for. The preferred solution is to take the data and put it in a domain model without the use of DataAdapters. LINQ to SQL goes some way towards achieving that easily. Previously you needed to use a DataReader (which a DataAdapter uses internally anyway) to get the data out and into your model.
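As a sketch of that DataReader-plus-factory approach (the Employee class, column names and connection string are invented for the example):

using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical domain type, just to show the mapping.
class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
}

static class EmployeeFactory
{
    // Reads rows with a DataReader and maps each one straight into the
    // domain model, with no DataSet or DataTable in between.
    public static List<Employee> LoadAll(string connStr)
    {
        var result = new List<Employee>();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("SELECT Id, Name FROM Employees", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(new Employee
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return result;
    }
}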
EmZan wrote: 2. You said that any query is done against the database! What I know is that a DataAdapter keeps the queried data, and I use it many times without going to the database again and again for the same query.
The DataAdapter dumps a disconnected copy of the data into a DataSet or DataTable. You are querying against that disconnected data. You never go back to the DataAdapter; you go back to the DataSet or DataTable. A DataAdapter's role is to suck the data out of the database and make a copy of it. If it were a proper caching mechanism it would be relatively seamless; in other words, you wouldn't know whether it was cached or not.
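A small sketch of that disconnected copy (the table name and connection string are invented): the adapter fills a DataTable once, and any further filtering is done against the in-memory copy rather than the database.

using System.Data;
using System.Data.SqlClient;

class DisconnectedCopyDemo
{
    static DataTable LoadEmployees(string connStr)
    {
        var table = new DataTable("Employees");
        // Fill() opens and closes the connection itself (returning it to the pool).
        using (var adapter = new SqlDataAdapter("SELECT Id, Name FROM Employees", connStr))
        {
            adapter.Fill(table);
        }
        return table;
    }

    static void Main()
    {
        DataTable employees = LoadEmployees("Server=.;Database=Shop;Integrated Security=true");

        // This filters the already-fetched copy; the database is not contacted again.
        DataRow[] smiths = employees.Select("Name LIKE 'Smith%'");
        System.Console.WriteLine(smiths.Length + " rows matched in memory");
    }
}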
|
|
|
|
|
Thanks a lot, it's very useful information.
|
|
|
|
|
This advice leaves me feeling rather ambivalent. On the one hand Colin appears to know what he's talking about, on the other he appears to be one of those dogmatic people who think that there is one way that is "the best" regardless of circumstances. I must emphasize that this is merely my impression, how I think the posts read, not a claim that Colin IS such a dogmatic person. (Perhaps he will reply and we will find out.)
Personally I think data adapters are useful and the disconnected data model can be enough for many things. Sure, if you have a fancy entity layer it would be ideal if the data can be persisted and reloaded as efficiently as possible, without any dataset intermediaries, but in many applications the truth is whether or not such an intermediary exists makes absolutely no difference to the value or usefulness of your application.
In my view, very few dogmas are of any use in programming. Sure, you should use StringBuilder and not string if manipulating string data, but even this makes little difference if the strings are small and modifications few.
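For what it's worth, the StringBuilder point in a few lines (the loop size is an arbitrary illustration):

using System.Text;

class StringBuilderDemo
{
    static void Main()
    {
        // += on a string copies the whole string each time; StringBuilder
        // appends into a growable buffer, which matters for large or many edits.
        var sb = new StringBuilder();
        for (int i = 0; i < 10000; i++)
        {
            sb.Append(i).Append(',');
        }
        string csv = sb.ToString();
        System.Console.WriteLine(csv.Length);
    }
}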
|
|
|
|
|
dojohansen wrote: he appears to be one of those dogmatic people who think that there is one way that is "the best" regardless of circumstances.
I don't think I am. I merely dislike DataAdapters/DataSets/DataTables. I find them incredibly clunky for what they do. I much prefer to get the data into my business model as quickly as I can. I do appreciate that for quick throwaway applications DataAdapters can serve as a very good shortcut, and I do use them on those occasions. But for most things I think they are too unwieldy and clunky.
If I am going for a purely ADO.NET approach I'd use a DataReader with a factory pattern to generate my objects. Alternatively I'd use an ORM like NHibernate.
dojohansen wrote: In my view, very few dogmas are of any use in programming.
That's true. Framework features exist for a reason and while I think that the number of use cases for certain features are quite low, there are times where I think they work best for the given circumstances.
|
|
|
|
|
We largely agree then. You're definitely right that using a reader and assigning the properties/fields of a class is faster to load than datasets. I just don't think the difference actually matters in all applications.
More importantly, data sets have some very nice features. They lend themselves to AJAX and web services rather well since their internal representation is XML, meaning they serialize and deserialize to and from XML very efficiently. You can save them to files and modify them in disconnected mode, such as on a laptop on the road, and easily sync back to some other store ("the central database") at a later time. You can filter the data and sort it very easily, add relations between entities of data dynamically, verify constraints, cascade deletes, and detect concurrency violations.
All of this is out-of-the-box functionality you get just by using datasets and adapters. If you want the same functionality but use your own custom data objects you run into a bunch of other constraints. Of course it is possible to recreate this sort of flexible and rich functionality in your entity objects, but trust me, if you do they will not be as lightweight anymore. To implement things like the DataTable.Select() method or DataViews and filters, or relations, you need to start including metadata in the classes or use reflection to discover it, and you then lose the raw speed of primitive data objects with hardcoded mappings, as in obj.Name = (string)reader["name"]; . Sometimes this extra speed is necessary, or at least highly desirable, for an app to do its job properly, but other times the user wouldn't be able to tell any difference at all.
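A compressed sketch of a few of those built-in features (the table and columns are made up; normally a DataAdapter would fill this from the database):

using System.Data;

class DataSetFeaturesDemo
{
    static void Main()
    {
        var orders = new DataTable("Orders");
        orders.Columns.Add("Id", typeof(int));
        orders.Columns.Add("Customer", typeof(string));
        orders.Rows.Add(1, "Smith");
        orders.Rows.Add(2, "Jones");

        var ds = new DataSet("Shop");
        ds.Tables.Add(orders);

        // Filtering and sorting the disconnected copy, no database involved.
        DataRow[] smiths = orders.Select("Customer = 'Smith'");
        var byName = new DataView(orders) { Sort = "Customer ASC" };

        // Built-in XML round-tripping: save offline, edit, reload and sync later.
        ds.WriteXml("orders.xml", XmlWriteMode.WriteSchema);
        var reloaded = new DataSet();
        reloaded.ReadXml("orders.xml");
    }
}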
So for me, dismissing datasets and adapters isn't a decision to be taken before you've asked yourself - and answered - this question: How much of the functionality it offers is useful to me? How much might become useful down the road?
A specific example where datasets may be an excellent fit: You're writing an AJAX-enabled web app and have this idea: What if we simply use the same schema for the XML data on the wire as that used by the DataSet itself? We can have a client-side component (such as a table with in-place editing capability) effectively perform Insert, Update, Delete operations on the disconnected dataset without having to contact the server at all, neither by postback nor any AJAX callback. The UI would then basically just become a specialized XML editor.
After multiple edits have been made and are ready to be persisted, the client makes an AJAX request the body of which is the edited XML document, and the server simply creates a dataset from the XML and uses DataAdapter.Update() to delete, insert, and update as required, detecting concurrency violations if any.
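On the server side, that round trip could look roughly like this sketch (the connection string and Orders table are assumptions, and the client would need to post the DataSet as a diffgram so the row states survive):

using System.Data;
using System.Data.SqlClient;
using System.IO;

class OrderSyncService
{
    const string ConnStr = "Server=.;Database=Shop;Integrated Security=true";

    // Rebuild the DataSet from the posted XML and let the adapter generate
    // and run the INSERT/UPDATE/DELETE commands for the pending changes.
    public void Persist(Stream postedXml)
    {
        var ds = new DataSet();
        ds.ReadXml(postedXml, XmlReadMode.DiffGram);   // keeps the Added/Modified/Deleted row states

        using (var conn = new SqlConnection(ConnStr))
        using (var adapter = new SqlDataAdapter("SELECT * FROM Orders", conn))
        using (var builder = new SqlCommandBuilder(adapter))   // generates the update commands
        {
            adapter.Update(ds, "Orders");   // throws DBConcurrencyException on conflicts
        }
    }
}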
Personally, I quite like the datasets. I just wish MS had done a better job with the typed datasets, because they completely ruined this functionality just by choosing naming conventions and a use of nested types that means writing ANY amount of code to use the generated stuff makes you cringe. I mean, seriously... it's stuff like this:
AdventureWorksDataSet.HumanResources_EmployeeTableRow r = AdventureWorksDS.HumanResources_EmployeeTable.NewHumanResources_EmployeeTableRow();
where it should have been
Employee e = DS.Employee.CreateNew();
For clarity, the reason I put DS.Employee.CreateNew() rather than new Employee() is just that this programming model could be achieved by changing nothing more than the naming and type-nesting conventions for typed datasets, whereas getting to the preferable (slightly; it's academic really, and makes no practical difference) new Employee() model would have required other changes, because DataRow has an internal constructor and can only be created by DataTable.NewRow(), obviously because the table contains the schema for the row.
Anyway, to me, the fact that the 'Employee' type is just a thin wrapper around a DataRow wouldn't really bother me in the least. I'd get a nice programming model that is way more OOP, but I'd also still have access to all the functionality that datasets already offer.
In fact, having written all of this I suddenly feel a bit tempted to make my own typed dataset generator!
|
|
|
|
|
dojohansen wrote: They lend themselves to AJAX and web services rather well since their internal representation is XML, meaning they serialize and deserialize to and from XML very efficiently
Sorry, but I have to disagree. They result in larger amounts of data needing to be transferred because of the MetaData overhead present, and they should not be used with web services because they are not interoperable. If you want a Java application to consume your service, there's a lot of code that's going to have to be written at the Java end because they have no idea what a DataSet is.
"WPF has many lovers. It's a veritable porn star!" - Josh Smith As Braveheart once said, "You can take our freedom but you'll never take our Hobnobs!" - Martin Hughes.
My blog | My articles | MoXAML PowerToys | Onyx
|
|
|
|
|
Larger amounts of data compared to what?
The alternatives are many, and JSON might be a great alternative in some ways, but the truth is that JSON has its own problems and you'd have to write a bunch of code to deal with it.
And it's simply not true that DataSets "aren't interoperable". They're just XML and if using Java means "a lot of code" is "going to have to be written at the Java end" to pick out nodes from an XML document then Java is pretty weak. That said, I'd agree they're not the best choice for web services if (as ought to be the case) you intend the service to be as easily consumed by non .net clients.
However, it's not like entity objects solve this problem! If you don't implement ISerializable but try to go down the .NET XML serialization route you can no longer encapsulate any of the object's state properly, because anything you'd like to serialize must be public and read-write. This is because .NET actually uses reflection to generate code for a serializer, incurring the reflection cost once, and then uses the compiled serializer to do the actual serialization. Just like compiled regex this gives great performance, but the problem of course is that you can't serialize any properly designed OOP objects. I personally wish that the Serializable attribute would instead cause the automatic inclusion of an implementation of ISerializable within the same type, so that we could serialize and deserialize otherwise readonly or private members without incurring any reflection cost.
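To illustrate the point about the XML serialization route (the type and values are invented): XmlSerializer only round-trips members that are public with both a getter and a setter, so encapsulation has to be loosened for anything you want on the wire.

using System;
using System.IO;
using System.Xml.Serialization;

public class Customer
{
    // Must be public and read-write to survive XmlSerializer;
    // a private setter or a readonly field would simply be skipped.
    public int Id { get; set; }
    public string Name { get; set; }
}

class XmlSerializationDemo
{
    static void Main()
    {
        var serializer = new XmlSerializer(typeof(Customer));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, new Customer { Id = 1, Name = "Ada" });
            Console.WriteLine(writer.ToString());
        }
    }
}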
So I think you're getting off the hook too easily if we just let you point out what overhead is involved with one solution without saying anything about how it should actually be done. I am sure that no matter what you suggest as an alternative solution, it will have some drawbacks of its own.
|
|
|
|
|
dojohansen wrote: They're just XML and if using Java means "a lot of code" is "going to have to be written at the Java end" to pick out nodes from an XML document then Java is pretty weak.
I'm going to defer to one of the industry experts on interoperability here (Christian Weyer) who advises against DataSets when dealing with non-.NET systems. Sure, it is easy for you, but for the guy on the other end who has to replicate the schema it is hellish. It isn't just about picking out a few XML nodes. When there is a response back you have to form a response that will deserialise into a dataset too.
|
|
|
|
|
Colin Angus Mackay wrote: I'm going to defer to one of the industry experts on interoperability here (Christian Weyer) who advises against DataSets when dealing with non-.NET systems.
Unless your only aim in this discussion is to create the impression that you are right and everybody else is wrong, why would you quote someone saying the same thing I just said, pretending that we disagree on something we do not?
I already said I agree it's not ideal for non-.NET clients, although I maintain that it would not in fact be much of a challenge for a moderately skilled developer to read this XML and turn it into objects native to the client. But please don't quote just that bit and pretend I don't agree that the WSDL should let his tools do that for him and that he shouldn't have to concern himself with XML, because I do agree. It's still valid to point out that it wouldn't be "hellish".
|
|
|
|
|