|
ZurdoDev wrote: True, but this IS a discussion forum. I don't mind a discussion; but asking which to use when is not a discussion.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Darina Smartym wrote: When is it better to choose serverless? Everything we do at my company is serverless because we don't want to have to deal with hardware in any way. We do, of course, have a laptop, but that's it.
The architecture you choose depends on what the business requirements are.
Social Media - A platform that makes it easier for the crazies to find each other.
Everyone is born right handed. Only the strongest overcome it.
Fight for left-handed rights and hand equality.
|
|
|
|
|
Got some time left, so revisiting this thread. YOU wanted the discussion.
ZurdoDev wrote: Everything we do at my company is serverless because we don't want to have to deal with hardware in any way. We do, of course, have a laptop, but that's it. By that definition, "calc.exe" has a serverless architecture, which is, of course, nonsense.
For any desktop app that doesn't communicate, "serverless" is a side-effect, not the main architecture.
One still has to learn about the pros and cons of every option to make an (informed) choice.
CSLA is a nice option for WinApps, which isn't helpful for calc.exe either. Still, it works wonders for some applications.
Stuff becomes interesting if you have multiple options; imagine chat - it could be serverless, of course, but it could also use CSLA for the UI, or be completely SOA. So what would you even call that?
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I'm not sure if you're aware of the marketing-buzzword version of "serverless" that's currently being pushed by cloud providers - mostly products that are quick and cheap to get up and running (to say nothing of ongoing costs and vendor lock-in). Given the mention of SOA, I'd guess the OP has been reading a lot about how to build "modern", "cloud-native" applications.
|
|
|
|
|
Dar Brett wrote: I'm not sure if you're aware of the marketing buzzword version of "serverless" that's currently being pushed by cloud providers I wasn't, and that isn't; if you use the cloud, you're depending on a server. Granted, not your own, but it is not "serverless". Peer-to-peer is.
Dar Brett wrote: Given the mention of SOA I'd guess the OP has been reading a lot of things about how to build "modern", "cloud-native" applications. A buzzword-magazine then, not a book on coding.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: I wasn't, and that isn't; if you use the cloud, you're depending on a server. Granted, not your own, but it is not "serverless". Peer-to-peer is.
You're preaching to the choir
Obligatory XKCD[^]
|
|
|
|
|
Q:
Does it make sense to have multiple resource files per culture, to organize things into logical groups (as opposed to one monolithic file per culture)? I'm thinking of things like Labels<.culture>.resx, ValidationMessages<.culture>.resx, etc.
Details:
Just getting into localization for the first time (C#, MVC if that's important).
I created a little POC form where I had ResourceTest<.culture>.resx files as embedded resources. I decorated model properties with data annotations to display the label for each field and a required-field message, made the button text vary with the culture - simple things like that. That seems to be working OK.
Thinking about applying localization to the entire application, it seems as if a single resource file per culture could get huge and unwieldy, so I thought it might be smart to have several .resx files per culture, to make it easier to find existing names/values. I'm not married to the idea of "functional areas" to group the data.
Doing some googling around, I couldn't find any best-practice guidance on this idea.
What say you, CP?
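For what it's worth, the split-by-functional-area idea might look something like this (a sketch only - the namespace, base names, and keys below are invented for illustration, not taken from the OP's project):

```csharp
using System.Globalization;
using System.Resources;

static class Localized
{
    // One ResourceManager per logical group; the base names are hypothetical
    // and follow the "several .resx files per culture" idea from the question.
    public static readonly ResourceManager Labels =
        new ResourceManager("MyApp.Resources.Labels", typeof(Localized).Assembly);

    public static readonly ResourceManager ValidationMessages =
        new ResourceManager("MyApp.Resources.ValidationMessages", typeof(Localized).Assembly);

    // Look up a label for a specific culture; resource fallback walks
    // Labels.fr-FR.resx -> Labels.fr.resx -> Labels.resx automatically.
    public static string Label(string key, CultureInfo culture) =>
        Labels.GetString(key, culture);
}
```

With this split, a key in ValidationMessages never collides with one in Labels, and translators can be handed one functional area at a time.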
|
|
|
|
|
I'm using embedded resources with data compression; my data is less than 1/3 of its uncompressed size. I decompress the resource stream at run time.
The resources were .NET "content" objects that were created, binary-serialized, compressed, and then embedded.
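A minimal sketch of that pattern, assuming GZip compression (the poster doesn't say which compressor or serializer was used, and the resource name is hypothetical):

```csharp
using System.IO;
using System.IO.Compression;
using System.Reflection;

static class CompressedResources
{
    // Opens an embedded, compressed resource by its manifest name
    // (the name is a placeholder) and returns the decompressed bytes.
    public static byte[] Load(string resourceName)
    {
        var asm = Assembly.GetExecutingAssembly();
        using (Stream raw = asm.GetManifestResourceStream(resourceName))
        {
            return Decompress(raw);
        }
    }

    // The decompression step, kept separate so it works on any stream.
    public static byte[] Decompress(Stream compressed)
    {
        using (var gzip = new GZipStream(compressed, CompressionMode.Decompress))
        using (var buffer = new MemoryStream())
        {
            gzip.CopyTo(buffer);
            return buffer.ToArray();
        }
    }
}
```

From there the bytes would be fed to whatever deserializer produced them in the first place.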
The Master said, 'Am I indeed possessed of knowledge? I am not knowing. But if a mean person, who appears quite empty-like, ask anything of me, I set it forth from one end to the other, and exhaust it.'
― Confucian Analects
|
|
|
|
|
agolddog wrote: What say you, CP? I'd say you're not the first to globalize an app.
Globalization | Microsoft Docs[^]
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I believe localization should only be applied where it's genuinely necessary. Multiple resource files will only complicate things. Keep it simple.
|
|
|
|
|
Hi,
I have designed a 4-layered C# application.
On a few of the forms I need to load several dropdowns in the page load event.
The legacy way was to put all the select queries in one stored procedure and load all the dropdowns at once.
For example, when loading customers, products, and cities on the invoice page:
Select * from Customer
Select * from products
Select * from cities
All queries go into one SP, and all dropdowns are loaded in page load.
In an OOP design,
GetCustomers
GetProducts
GetCities
are all separate methods, and each call hits the database, so 3 server calls will happen. Is that the right way, or is there a more efficient way?
Please suggest.
Regards,
Elavarasan
|
|
|
|
|
"Select *" is the lazy man's approach; expect your apps to perform the same.
The Master said, 'Am I indeed possessed of knowledge? I am not knowing. But if a mean person, who appears quite empty-like, ask anything of me, I set it forth from one end to the other, and exhaust it.'
― Confucian Analects
|
|
|
|
|
I differentiate between static/master tables and dynamic data. Cities I would load the first time they are required and cache the collection, so it only loads once, not every time you hit the page. Customers I would expect to change more often, so I might load them every time you hit the page.
I would never use a single SP to load multiple tables; I once tested it, and it was dramatically slower than using a single proc per table.
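A minimal sketch of that caching idea (the class, method, and data shapes here are invented for illustration):

```csharp
using System;
using System.Collections.Generic;

static class CityCache
{
    private static IReadOnlyList<string> _cities;

    // Loads the static/master data on the first call and reuses the
    // cached collection on every later page hit.
    public static IReadOnlyList<string> GetCities(Func<IReadOnlyList<string>> loadFromDb)
    {
        if (_cities == null)
        {
            _cities = loadFromDb();   // one database round trip, first hit only
        }
        return _cities;
    }
}
```

Customers, by contrast, would bypass the cache (or use a short expiry) since they change more often.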
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
Can it be done? YES. Is it appropriate? Probably not.
First, I STRONGLY suggest you NOT use select *; it's really a lazy way to do things.
Second, if you're going to do it, you have to return a DataSet, not a DataTable, in your ADO call.
Then your proc would look like this:
Create procedure dbo.myproc_GetParameters as
Select id, name from Customer
Select id, name from products
Select id, name from cities
Then set.Tables[0] will be customers, set.Tables[1] will be products...
This is a LOT of data to pull at once, and poor performance...
You're MUCH better off pulling them individually in separate calls.
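On the ADO.NET side, filling a DataSet from a multi-result-set proc like the one above might look like this (a sketch - the connection string is a placeholder, and the proc name matches the example only by assumption):

```csharp
using System.Data;
using System.Data.SqlClient;

static class DropdownLoader
{
    // Executes the stored procedure once; each SELECT inside the proc
    // becomes one DataTable in the returned DataSet, in order.
    public static DataSet LoadAll(string connectionString)
    {
        var set = new DataSet();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.myproc_GetParameters", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(set);   // opens/closes the connection itself
            }
        }
        return set;   // set.Tables[0] = customers, [1] = products, [2] = cities
    }
}
```

The same SqlDataReader would also work via NextResult(), but SqlDataAdapter.Fill is the shortest route to a DataSet.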
|
|
|
|
|
C. David Johnson wrote: This is a LOT of data to pull at once and poor performance... So, fetching the data in multiple calls increases performance?
C. David Johnson wrote: Your MUCH better off to pull them individually in separate calls Not if the reason is "just" to have individual separate calls.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Hello everyone.
For the past 3 years I have been a full-time web developer working on a corporate R&D team.
I do know how to solve a problem - or, you could say, I'm the one that "just makes things work" - but most of the time my solution is not very extensible or clear for whoever else is working on it.
I would like to ask you how you got through that phase in the software development world, and whether there are any specific things I could do other than reading articles?
Just some additional info: I use C# / Angular.
|
|
|
|
|
It doesn't matter what languages or framework you use, it's the mindset you need.
Start by thinking about the past projects. How did they change? How did they evolve, what happened to the specification? What would have made it simpler to add those changes?
I work on a simple principle: design for change, design for maintenance.
The maintenance portion of a project is generally much longer than the "coding" part - and is normally done when the original project is no longer "familiar code" (either because it's been a year since you last looked at it, or you aren't the guy who wrote it) so design for that, rather than the immediate moment.
Don't do "clever code": it's a PITA to modify (or even understand six months down the line)
Do design for generic use instead of specifics: test it, add it to a library of "useful code" and use it again, and again for multiple projects.
Do design for problems: for example when I do a switch on an enum I always write this:
switch (myVariable)
    {
    default:
        throw new ArgumentException($"The value \"{myVariable}\" is not handled");
    case MyEnum.First: ...
    ...
    case MyEnum.Last: ...
    }
So if you add an option to the enum, it is detected and you have to deal with it, instead of it being silently ignored because it isn't in the switch list.
Do check your inputs: make sure data is correct as far as you can before you try to process it.
Defensive programming helps you to prevent problems.
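To make the input-checking point concrete, a tiny guard-clause sketch (the method and parameter names are invented for illustration):

```csharp
using System;

static class OrderProcessor
{
    // Validate the inputs up front, before any real work happens; a bad
    // argument fails fast with a clear message instead of corrupting state.
    public static decimal ApplyDiscount(decimal price, decimal percent)
    {
        if (price < 0)
            throw new ArgumentOutOfRangeException(nameof(price), "Price cannot be negative");
        if (percent < 0 || percent > 100)
            throw new ArgumentOutOfRangeException(nameof(percent), "Discount must be 0-100");

        return price - (price * percent / 100m);
    }
}
```

The guards cost almost nothing and turn a mystery bug six months later into an obvious stack trace today.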
But ... the biggest helper here is experience. Three years isn't that much, not really - keep thinking about what you are coding, why, and who for: you'll get there!
Sent from my Amstrad PC 1640
Never throw anything away, Griff
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
10,000 hours.
The Master said, 'Am I indeed possessed of knowledge? I am not knowing. But if a mean person, who appears quite empty-like, ask anything of me, I set it forth from one end to the other, and exhaust it.'
― Confucian Analects
|
|
|
|
|
I agree with the points mentioned by fellow members. I just wanted to convey my thoughts on OOP/software design knowledge, and how experience can - or in some cases cannot - bring that knowledge. If I am working on one product continuously, I am getting better at that particular software, but not necessarily becoming better at OOP/software design, because I would not know what's outside it.
You are lucky if you get to work on a variety of projects at different development stages or on different frameworks; the experience of working with diverse software is invaluable for OOP/software design. But if you work on the same software (such as maintenance and minor enhancements), you could look at a completely different language/framework and start making side projects. I find this approach very useful, because when I start learning a new language/framework I have to set my mind back to "0", and I pick up approaches I had not been working with. For example, working with Android development in Java made me better at background tasks and multithreading, which I rarely used in my regular .NET work, and I started to use the techniques I learned in Android/Java in my .NET/C# development.
|
|
|
|
|
Hi all,
I would like to build an algorithm that takes a SQL database as input (for example SQL Server or MySQL) and returns the most important tables/views/fields in it.
Now, "important" is defined as follows:
1. Large tables (where "large" is not an absolute value, but relative to the other tables)
2. Tables with many linked tables attached (for example orderHeaders, orderDetails, etc.)
3. Fields with large variance (fields where all values are null or 0 are NOT important)
4. Frequently changed fields (if users updated a table's records yesterday and the day before, it is important)
5. Fields with meaningful text inside
6. More properties I'm not thinking about right now...
Ideally, I would like to tweak the importance of each property above and get different results.
How would you approach and design this kind of algorithm, since it involves data modeling, analytics, and AI all together?
Are there any known services or APIs I can use to shorten development time?
Thank you all in advance,
Michael
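One simple way to make the "tweakable importance" idea concrete is a weighted score per table. This is only a sketch - the metric names, the 0..1 normalization, and the default weights below are all invented, not part of the question:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class TableMetrics
{
    public string Name;
    public double RelativeSize;      // 0..1: size vs. the largest table
    public double FkLinkage;         // 0..1: FK count vs. the most-linked table
    public double RecentChangeRate;  // 0..1: recent updates vs. the busiest table
}

class ImportanceRanker
{
    // Tunable weights - adjusting these changes the ranking.
    public double SizeWeight = 0.4;
    public double LinkWeight = 0.4;
    public double ChangeWeight = 0.2;

    public List<(string Name, double Score)> Rank(IEnumerable<TableMetrics> tables) =>
        tables.Select(t => (t.Name,
                            SizeWeight * t.RelativeSize +
                            LinkWeight * t.FkLinkage +
                            ChangeWeight * t.RecentChangeRate))
              .OrderByDescending(x => x.Item2)
              .ToList();
}
```

Collecting the metrics is the hard (and database-specific) part; the ranking itself stays trivial and generic.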
|
|
|
|
|
Michael Sterling wrote:
1. Large tables (where "large" is not an absolute value, but relative to the other tables) Like the list of country-codes, with their names?
Michael Sterling wrote: 2. Tables with many linked tables attached (for example orderHeaders orderDetails etc.) Like the list of country-codes?
Michael Sterling wrote: 3. Fields with large variance (fields that all values within are null or 0 NOT important) Tables with lots of null values are badly designed.
Michael Sterling wrote: 4. Frequently changed fields ( If users updated table records yesterday and a day before it is important) Do you keep such information?
Michael Sterling wrote: 5. Fields with Meaningful text inside I would love that; often you see a dot (.) for a required field in the db.
Michael Sterling wrote: 6. more properties I'm not thinking about right now.... Then they should not be on the list. If I order a car, I quote what I need, without adding the text that there is more coming that I don't know yet. Focus on what you do know, and you can extend it later.
Michael Sterling wrote: How would you approach and design this kind of algorithm, since it involves data modeling, analytics, and AI all together? I'd not use an AI just to be able to say there is AI in there. Start by finding a way to get the amount of daily traffic for a record, and work from there.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I thought it was a design & architecture forum...
If you have nothing smart to say, better not say it at all.
|
|
|
|
|
Noted, and with pleasure
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
The trouble here is that you are going to be querying the database architecture, and that will change between each database type. I would also rank the importance of the tests to reduce the amount of work your system is going to have to do. Assuming SQL Server:
1. You can query the system to get the physical size and/or the row count of each table using SMO.
2. Query the system views to get the FK count (and possibly the links to the parent tables).
3. As Eddy said, 0 or null fields indicate a badly designed database. This is going to be costly to query against the large tables (if it is the small tables, you REALLY have a problem).
4. Again, Eddy has it right: do you store/audit the change information?
5. Sounds like 3 all over again - extend the definition of "meaningful".
6. In your dreams - if you can design something for forward requirements, you are better than the rest of us.
What you are proposing is a rules engine, and they are excellent while the rule count is small; once it grows too large, the entire thing becomes unsupportable. There are MANY commercial rules engines out there.
I would approach this by demanding the business case for such a tool: what are the benefits, and who is going to support, extend, and pay for it? If that is not forthcoming, I would shelve the entire thing.
I can't see how AI would help here (my AI knowledge is zero), and it would be a major exercise for each database you are going to support. I would also break it into 2 major projects: the database-querying project, which will need to be extended with each database you support, and the analytical project, which should be generic, accepting data from all the database types.
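As a sketch of points 1 and 2 against SQL Server (using the catalog views rather than SMO, and simplified - the connection string is a placeholder):

```csharp
using System;
using System.Data.SqlClient;

static class SchemaStats
{
    // Prints approximate row counts per table plus how many foreign keys
    // reference each table, using SQL Server catalog views.
    public static void Print(string connectionString)
    {
        const string sql = @"
            SELECT t.name,
                   SUM(p.rows) AS row_count,
                   (SELECT COUNT(*) FROM sys.foreign_keys fk
                    WHERE fk.referenced_object_id = t.object_id) AS referencing_fks
            FROM sys.tables t
            JOIN sys.partitions p ON p.object_id = t.object_id
                                 AND p.index_id IN (0, 1)  -- heap or clustered index
            GROUP BY t.name, t.object_id
            ORDER BY row_count DESC;";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine($"{reader.GetString(0)}: " +
                        $"{reader.GetInt64(1)} rows, {reader.GetInt32(2)} FKs referencing it");
                }
            }
        }
    }
}
```

Every other database engine would need its own version of this query, which is exactly why the querying project and the analytical project should be kept separate.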
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
Thanks for your answer. I like your thought process.
BTW, what commercial rules engines are you referring to?
|
|
|
|
|