|
I'm not sure if I agree with that. From what I can see, that's the whole point of WebSockets, and of SignalR which is built on it: to maintain a connection to the clients for the purpose of real-time communication.
|
|
|
|
|
"Real time"?
zephaneas wrote: From what I can see that's the whole point of WebSockets
Depends on which part of the Dropbox client you want to recreate. It is not my opinion, but from Explorer's point of view it makes sense; a file's status is not relevant to the user until they request that file.
Before the file can be requested, its status is requested. Explorer will still show the files, just not with the correct status initially. You can see this happening visually on a slow computer: the first overlay icon is the blue refreshing arrows when the folder is first opened, and then the actual overlay icon with the correct status appears once the status is retrieved.
Now, "real-time" is reserved for anything that is updated within 1/24 of a second, as that is what the human eye perceives as real time. I don't care what framework you use; if it is on Windows, it will be as real-time as the idiot who ran a marathon just to deliver a message.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
zephaneas wrote: From what I can see that's the whole point of WebSockets, and SignalR which is built on it
We either have a nomenclature issue or you are mistaken about what WebSockets do (I know nothing about SignalR, the latter).
Communication involves two parts:
1. Establishing the connection.
2. Sending messages.
In normal communications, like web traffic, a client (any computer, any application) attempts to 'connect' to a server (any computer, any server application).
WebSockets allow a client to create a connection to a server and then facilitate message handling (step 2 above) between the client and the server.
A real callback requires a reversal of that connection protocol, in that the server would then need to perform step 1 by attempting to connect to the original client. WebSockets do not do that.
Some reasons for clients not doing real callbacks.
1. The server cannot, in fact, connect to the client. Although a client might have a route to a server, the server is not likely to have a route to the client, nor even know how to connect to the client. This is much, much more likely to be true on the internet.
2. Servers are intended to be static resources; clients are temporary. Thus even if a server attempted a callback, the client might no longer be there.
3. Establishing connections can be a resource-intensive process, as is handling connections. Asking a server to do both, when a client is likely connecting to the server often in the first place, is a pointless waste of resources, not to mention added complexity for the system.
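To make the distinction concrete, here is a minimal sketch using .NET's ClientWebSocket (the URL is made up): even "server push" arrives over the one connection the client opened; the server never connects back to the client.

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class WebSocketSketch
{
    static async Task Main()
    {
        // Step 1: the CLIENT establishes the connection; the server only accepts it.
        using var ws = new ClientWebSocket();
        await ws.ConnectAsync(new Uri("wss://example.com/updates"), CancellationToken.None);

        // Step 2: messages flow both ways over that same client-initiated socket.
        // What looks like a "callback" from the server is just a frame arriving
        // on the connection the client opened; no reverse connect ever happens.
        var buffer = new byte[4096];
        var result = await ws.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
        Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, result.Count));
    }
}
```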
|
|
|
|
|
You should reconsider WCF. Yes; once you get the settings right, "save" them.
Other than that, we have lots of services communicating with different third parties exchanging multi-megabyte compressed payloads (shipping documents and label images) asynchronously from multiple locations.
|
|
|
|
|
I would really prefer to use WCF, but I can't seem to get past the exceptions I'm getting.
This weekend I'll post it here so we can continue this.
Thanks
If it's not broken, fix it until it is
|
|
|
|
|
My usual approach is to get a trivial case going and then expand upon it if I don't have a working solution already.
Getting Started Tutorial[^]
I've done the above previously and it works.
One area that does cause confusion is "test versus live", where you're usually communicating over HTTP versus HTTPS. Maintaining dual settings is an issue, but one can "clone" live endpoints and modify them on the fly for test endpoints, or vice versa (you can't construct them from scratch, AFAIK). That's if you're dealing with multiple servers. In other cases, the 3rd party may just use different credentials for testing against the same endpoint.
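One way that "clone and modify" can be sketched is by building the proxy's binding and address in code, so only the parts that differ between test and live get flipped. The contract, proxy class, and URLs here are all hypothetical; this is a sketch of the pattern, not anyone's actual service.

```csharp
using System.ServiceModel;
using System.ServiceModel.Channels;

// Hypothetical contract and proxy; the URLs below are made up.
[ServiceContract]
public interface IShippingService
{
    [OperationContract]
    string Ping();
}

public class ShippingClient : ClientBase<IShippingService>, IShippingService
{
    public ShippingClient(Binding binding, EndpointAddress address)
        : base(binding, address) { }

    public string Ping() => Channel.Ping();
}

public static class Endpoints
{
    // Keep one set of settings and flip only what differs for test:
    // the address scheme and the matching security mode.
    public static ShippingClient Create(bool live)
    {
        var binding = new BasicHttpBinding
        {
            Security = { Mode = live ? BasicHttpSecurityMode.Transport  // HTTPS
                                     : BasicHttpSecurityMode.None }     // plain HTTP
        };
        var address = new EndpointAddress(live
            ? "https://api.example.com/Shipping.svc"
            : "http://test.example.com/Shipping.svc");
        return new ShippingClient(binding, address);
    }
}
```

Moving between HTTP and HTTPS means the binding's security mode has to change along with the address, which is the part that usually bites when only the URL is swapped.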
|
|
|
|
|
I followed this article[^].
It works ok as is, but trying to expand on it gives me fits.
See my post following this one where I outline the project requirements.
If it's not broken, fix it until it is
|
|
|
|
|
I learned early on to create a class library / dll for every 3rd party service; including the "corporate" one. Already had to swap out one vendor for another; took a few hours.
The sample you referenced should be refactored; too much UI code polluting the WCF code space.
|
|
|
|
|
I completely agree. Like I said, coded as-is it works fine. When I refactor it for my app I get all manner of strange errors, mostly duplex-related.
I'll try again and repost with error details.
Did you see my other post above? Curious about your thoughts.
If it's not broken, fix it until it is
|
|
|
|
|
We have a WPF/WinForms standalone enterprise application, and it has become an elephant with a lot of features.
This has impacted performance immensely, and adding/modifying anything is a big pain now.
The application is used to configure hardware parameters and communicates two-way within the local network.
The application follows the MVVM architecture; however, due to legacy code, not everything can be changed to MVVM.
We are planning to revamp/rewrite the application and are looking at the available options.
Typical usage of the application shows around 1.2 GB of memory in Task Manager.
The application has a canvas with rectangular objects displayed in a network fashion.
If we have around 200 objects and perform a select-all and drag/drop to some other location, it takes around 5 seconds and isn't smooth.
Reasons for the slow application:
- Mix of WPF and WinForms modules. When we create a few objects, the legacy WinForms code creates controls dynamically; as a result, the operation can't be pushed to a background thread.
- Lots of styles and templates to give a good look and feel.
- A lot of objects stay in memory; some objects are duplicated for copy/paste functionality.
- Third-party libraries:
- Caliburn for MVVM: allows easy DI; however, as we have a lot of objects, retrieving an object from a huge collection is slow.
- Infragistics
Technology in consideration:
Revamp:
- Web Services: Reuse the C# code and move it to some web service. WCF/Web API
- Windows Services: Keep most of the things in the windows service and make the WPF client thin
- Improve performance
- Make code asynchronous as much as possible.
- Optimize styles and templates
- Perform time-consuming operations in web services in the cloud
- Store objects in the database and use information from there.
Rewrite:
- Web based application using new technology stack such as MEAN.
Note: Team Expertise is in .Net/C#, however we are open to other technologies.
Question: Revamp OR Rewrite the application? Which technology stack to consider?
Looking forward to your valuable suggestions.
|
|
|
|
|
Praveen Raghuvanshi wrote: This has impacted the performance immensely and adding/modifying anything is a big pain now.
Either you are using the word "performance" incorrectly or there are two parts to that sentence which conceptually have nothing to do with each other. That said it is possible that the former is a result of the latter but that is likely an assumption without proof.
Praveen Raghuvanshi wrote: If we have around 200 objects and perform select all and drag/drop to some other location, it takes around 5 seconds and its not smooth.
Based on that and the rest of the description this suggests there is a design problem. And that is the cause of the performance issue. Could also be a requirements issue.
One doesn't solve design problems with technology but rather with better designs. And one definitely doesn't solve requirement bugs with technology.
As described the application footprint on a modern machine is trivial. Presumably this is being run on modern machines right?
Praveen Raghuvanshi wrote: Looking forward for your valuable suggestion.
Start by profiling the application while users are using it. That allows you to both determine what the actual performance problems are and also determine what real usage looks like.
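As a cheap first pass before reaching for a full profiler, the suspect operations can be wrapped and timed in place; the `canvas.MoveSelection` call in the usage comment is hypothetical, just a stand-in for whatever the drag code actually calls.

```csharp
using System;
using System.Diagnostics;

static class Perf
{
    // Wraps any suspect operation (e.g. the select-all + drag of 200 objects)
    // and logs how long it took, so hotspots can be compared across real usage.
    public static T Measure<T>(string label, Func<T> operation)
    {
        var sw = Stopwatch.StartNew();
        var result = operation();
        sw.Stop();
        Trace.WriteLine(label + ": " + sw.ElapsedMilliseconds + " ms");
        return result;
    }
}

// Usage (hypothetical call site):
// var moved = Perf.Measure("drag 200 nodes", () => canvas.MoveSelection(delta));
```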
|
|
|
|
|
Appreciate your thoughts on this.
|
|
|
|
|
Praveen Raghuvanshi wrote: Rewrite the application? You might want to read about what happened to Netscape Navigator. I'd recommend replacing parts of the old system, bit by bit, instead of replacing the entire app at once.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Thanks for sharing the strategy and the story of Netscape Navigator. Recently, I came across developing software using microservices, and I am planning to consider it.
Can you share a strategy for "replacing parts of the old system, bit by bit"? I see you have good experience with C# and WinForms; maybe there's something you can share from that experience. Have you ever moved, or thought of moving, a WinForms application to WPF?
modified 28-Jan-16 9:57am.
|
|
|
|
|
Praveen Raghuvanshi wrote: Microservices and I am planning to consider it. If you can manage a lot of them, yes; otherwise you might end up with ravioli-code[^].
Praveen Raghuvanshi wrote: Have you ever moved or thought of moving Winform application to WPF? Done a few migrations, but for my own stuff I still prefer WinForms.
Praveen Raghuvanshi wrote: Can you share a strategy for "Replacing parts of the old system, bit by bit"? You rip out a form and replace it. Once that works, you move on to the next form. Start with the form that gets the most complaints, or the one with the most active bugs in your issue tracker.
Once you know how many screens there are, and how long it takes to migrate a single screen, you can estimate the total time that will be required.
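For the WinForms-to-WPF case specifically, the bit-by-bit approach can be sketched with ElementHost: rewrite one screen's content in WPF and host it inside the existing WinForms shell. The form name is hypothetical, and the TextBlock is just a stand-in for the real migrated WPF UserControl.

```csharp
using System.Windows.Forms;
using System.Windows.Forms.Integration; // ElementHost lives here

// Hosts one migrated WPF screen inside the existing WinForms shell.
public class OrdersFormShell : Form
{
    public OrdersFormShell()
    {
        var host = new ElementHost { Dock = DockStyle.Fill };
        host.Child = new System.Windows.Controls.TextBlock
        {
            Text = "Migrated WPF content for this one screen"
        };
        Controls.Add(host);
        // The rest of the app stays WinForms until this screen proves out;
        // then repeat with the next-most-complained-about form.
    }
}
```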
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Sounds good. I appreciate your prompt response and your enlightening me on the subject.
Praveen Raghuvanshi
Software Developer
|
|
|
|
|
You're welcome
|
|
|
|
|
Straight to the point:
I have a WinForms app I wrote years ago that persists some settings to a local file. A subset of these would be useful to share across a few of my machines.
I'd like to have some of these settings persisted to something like OneDrive, so that copies of my app running on multiple machines have those settings available to them: change a setting once, and all the machines see the new value the next time they launch (assuming the user provides his OneDrive account credentials).
I've never looked into the OneDrive SDK, so I'm going to ask the question like a naïve user: Is OneDrive appropriate for this? Overkill? Square peg in a round hole? Ultimately I want to give OneDrive a simple string (say, XML) and have it take care of persisting it. Then any of my machines using the same account would have access to the same string. So what's the minimum that needs to be done, architecturally, to make this happen?
|
|
|
|
|
dandy72 wrote: Ultimately I want to give OneDrive a simple string (say, XML) and have it take care of persisting it.
That would fit more with a web server than a document store.
However, it looks like you could sort of do that, depending on what you mean by "give":
OneDrive Dev Center - Develop with the OneDrive API[^]
|
|
|
|
|
jschell wrote: That would fit more with a web server than a document store.
"Web server" is such a generic term.
OneDrive's infrastructure is already in place, it scales, and its maintenance is essentially somebody else's problem. I just want to be able to log into a bunch of my machines, and the app I run on all of them needs access to the settings that are common to all the instances I've associated with that account.
Thanks for the link. It looks familiar, so I've probably come across it before. Something tells me I have an awful lot of reading material to go through just to decide whether this is something that makes sense to use or not.
|
|
|
|
|
I would assume that you can discover whether the local machine has a OneDrive mapped; supplying credentials to a mapped drive is trivial. If you own the OneDrive, you should be able to pick a location and get at a settings file.
It would get a little more interesting if you needed to get someone else's OneDrive account to share a folder on your account!
As JSchell pointed out, a WCF service running on a web/intranet server would actually be simpler.
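If the machine already syncs a OneDrive folder, the mapped-folder idea can be sketched as plain file I/O under that folder; the %OneDrive% environment variable is how recent Windows installs expose the sync root, and the app folder and file name below are made up.

```csharp
using System;
using System.IO;

static class SharedSettings
{
    // Assumes the OneDrive client sets the %OneDrive% environment variable;
    // falls back to a local folder when it doesn't. "MyApp" and the file
    // name are hypothetical.
    static string SettingsDir()
    {
        var root = Environment.GetEnvironmentVariable("OneDrive")
                   ?? Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
        return Path.Combine(root, "MyApp");
    }

    public static void Save(string xml)
    {
        Directory.CreateDirectory(SettingsDir());
        // OneDrive's own sync engine replicates the file to the other machines.
        File.WriteAllText(Path.Combine(SettingsDir(), "shared-settings.xml"), xml);
    }

    public static string Load()
    {
        var path = Path.Combine(SettingsDir(), "shared-settings.xml");
        return File.Exists(path) ? File.ReadAllText(path) : "";
    }
}
```

The trade-off versus the OneDrive API is that this only works on machines where the sync client is installed and signed in, but it needs no SDK at all.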
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I was thinking about a question[^] asked in QA.
Rather than mapping POCOs/DTOs across multiple projects, I typically create an Entities project in my solution and reference it from all the other projects that need it. One set of classes is much easier to maintain, IMO.
So, given that, any reason not to make these classes all serializable as well as implement INotifyPropertyChanged?
I don't see any issues with this. Anyone else?
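One real wrinkle worth noting: the PropertyChanged event's delegate list must be excluded from serialization, or whoever happens to be subscribed gets dragged along. A minimal sketch of such a dual-purpose entity (class and property names illustrative):

```csharp
using System;
using System.ComponentModel;
using System.Runtime.CompilerServices;

// An entity that crosses both boundaries: [Serializable] for the wire,
// INotifyPropertyChanged for WPF binding.
[Serializable]
public class Customer : INotifyPropertyChanged
{
    // Exclude the event's backing delegate field from serialization,
    // otherwise the subscribers get serialized with the entity.
    [field: NonSerialized]
    public event PropertyChangedEventHandler PropertyChanged;

    private string _name;
    public string Name
    {
        get { return _name; }
        set
        {
            if (_name == value) return;
            _name = value;
            OnPropertyChanged();
        }
    }

    protected void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        var handler = PropertyChanged;
        if (handler != null) handler(this, new PropertyChangedEventArgs(propertyName));
    }
}
```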
If it's not broken, fix it until it is
|
|
|
|
|
Kevin Marois wrote: I don't see any issues with this. If you can generate the bulk of the code or abstract it away, go ahead.
It might cost a bit of performance, might raise PropertyChanged without the code doing much with it, and might introduce a bit of overhead, but it would save time in having to implement it and/or update/change its implementation.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
This is exactly the design we use, including the PropertyChanged event. The "models" project is then referenced by both the WCF and the WPF/Silverlight projects. It seems to work perfectly for us, and we have more than 18 apps in production.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
I think there is a tool in Visual Studio that helps testers capture images and video of their steps during manual testing. Could you help me find more information about it?
I think I'm using bad keywords in Bing/Google, because I find a lot of information about debugging but nothing specific to VS.
Thank you,
|
|
|
|