|
You start by researching the requirements of the accounting software. Once you know the requirements, you can start the actual design: inputs, outputs, data storage, and so on. As a beginner, though, you would probably be better off working on some smaller projects first.
|
Downvote countered.
This space for rent
|
As Richard said, you would start off by researching the requirements for accounting software. Long before you type your first line of code, you should understand what it is that you are trying to develop. Do you have an accountancy background? If so, that will help you. If not, you are going to have to talk to accountants to get an understanding of what they require. To give you an example of why this is necessary, consider the case of Value Added Tax (VAT). Will your system be able to cope with VAT added at source? What about VAT that gets applied at the destination? Can you cope with different VAT rates? What about the ability to handle VAT discounts (where VAT is applied at one rate for part of the year and at another rate for the rest)?
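To make that last requirement concrete, here is a minimal sketch of one way to model time-varying rates (the class name, dates, and rates are all invented for illustration; real VAT rules are far messier): a rate table keyed by effective date, so a temporary rate cut is just another row.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: a date-dependent VAT rate table, so a temporary
// mid-year rate cut (a "VAT discount") is handled the same way as any
// other rate change.
public static class VatRates
{
    // (effectiveFrom, rate) pairs; these values are made up.
    static readonly List<(DateTime From, decimal Rate)> Table = new()
    {
        (new DateTime(2020, 1, 1), 0.20m),
        (new DateTime(2020, 7, 1), 0.05m),  // temporary reduced rate
        (new DateTime(2021, 4, 1), 0.20m),  // back to the standard rate
    };

    // The rate in force on the invoice date: latest entry at or before it.
    public static decimal RateFor(DateTime invoiceDate) =>
        Table.Where(e => e.From <= invoiceDate)
             .OrderByDescending(e => e.From)
             .First().Rate;

    public static decimal VatOn(decimal net, DateTime invoiceDate) =>
        Math.Round(net * RateFor(invoiceDate), 2);
}
```

The point is not this particular design, but that the *requirement* (rates change over time, and an invoice must use the rate in force on its date) has to be understood before any code is written.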
This space for rent
|
What be this VAT thing you talk about?
And that is the point: is it going to be an international package, or just for his home country?
Never underestimate the power of human stupidity
RAH
|
And gawd help his "customers" ...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
Or, for domestic customers who also trade internationally?
This space for rent
|
Member 11580781 wrote: how to do this task
Building accounting software is not a task but a project made up of thousands of tasks. Define your tasks, and then build it.
There are only 10 types of people in the world, those who understand binary and those who don't.
|
What everyone is trying to say is that it sounds like you're in way over your head.
Writing accounting software is not an easy project: it would take several developers months to get a base version, and years to get a mature, market-competitive one. It requires in-depth knowledge of several different domains: databases, accounting, the relevant legislation, development, IT hardware, etc.
Try finding some information on existing packages and check out what they can do. That should give you an idea of why not to attempt this...
|
// A minimal starting point: one ledger account with running totals.
public class Account
{
    public string AccountNumber { get; set; }
    public string AccountType { get; set; }
    public string AccountName { get; set; }
    public decimal TotalDebits { get; set; }
    public decimal TotalCredits { get; set; }
}
|
Hello, I need sample code for a dual-screen display using a UserControl (usercontrol.xaml) in WPF/C#.
I need to display part of the screen from the primary monitor on another screen.
|
And?
What have you tried?
Where are you stuck?
What help do you need?
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
System.Windows.Forms.Screen s1 = System.Windows.Forms.Screen.AllScreens[0];
ticket.Visibility = Visibility.Visible;
System.Drawing.Rectangle r1 = s1.WorkingArea;
ticket.Height = 50;
ticket.Width = 150;
It's not showing on the other screen. ("ticket" is the name of the UserControl from usercontrol.xaml.)
|
Ummm... did you set the position of the window? It doesn't happen by magic.
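A minimal sketch of what "setting the position" looks like, assuming the control is hosted in a WPF Window (the method, the screen index, and the sizes here are assumptions for illustration, not your actual code):

```csharp
using System.Windows;
using System.Windows.Forms;  // WinForms Screen API, usable from WPF

// Move a WPF window onto the second monitor, if one is attached.
// Your code reads AllScreens[0], which is normally the PRIMARY screen;
// the secondary monitor is usually at another index.
static void MoveToSecondScreen(Window w)
{
    Screen[] screens = Screen.AllScreens;
    if (screens.Length < 2) return;          // no second monitor attached

    System.Drawing.Rectangle area = screens[1].WorkingArea; // pixel coords
    w.WindowStartupLocation = WindowStartupLocation.Manual;
    w.Left = area.Left;    // caution: WPF positions are in device-independent
    w.Top = area.Top;      // units, so high-DPI setups may need a conversion
    w.Width = 150;
    w.Height = 50;
    w.Show();
}
```

Note that a UserControl has no screen position of its own; only the Window hosting it can be placed on a particular monitor.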
|
I need to display part of a UserControl (usercontrol.xaml) on another screen, i.e. a dual-screen setup, in WPF/C#. Any sample?
|
iTextSharp.text.Document Doc = new iTextSharp.text.Document(PageSize.LETTER, 20, 20, 20, 20);
string PDFOutput = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "Output.pdf");
PdfWriter writer = PdfWriter.GetInstance(Doc, new FileStream(PDFOutput, FileMode.Create, FileAccess.Write, FileShare.Read));
Doc.Open();
string Folder = "C:\\Images";
foreach (string F in System.IO.Directory.GetFiles(Folder, "*.jpg"))
{
    Doc.NewPage();
    Doc.Add(new iTextSharp.text.Jpeg(new Uri(new FileInfo(F).FullName)));
}
Doc.Close();
|
Don't repost your question in multiple forums. Pick ONE and stick with it. You already posted this in QA.
|
Hey Gang... I've got a folder with about 17,000 text files in it. I need to parse those text files and insert the parsed results into a SQL database. I've got it to where I'm doing about 4-5 files per second, but I need it to be faster than that. The text files aren't straightforward either. An example of the data in a text file:
World Clock Location Entries=INXX0102|Pune|India
Weather Location ID=48226|Detroit| MI (48226)
English/Metric Units=0
CLNAME001=
CLNUMBER001=xxx-xxx-xxxx
CLTYPE001=4
CLDATE001=11/09/16
CLTIME001=18:07
CLDURATION001=
CLBRIDGEDFLAG001=0
CLMISSEDCNTR001=1
CLBCALBL001=
CLNAME002=
CLNUMBER002=xxx-xxx-xxxx
CLTYPE002=4
CLDATE002=11/09/16
CLTIME002=17:59
CLDURATION002=
CLBRIDGEDFLAG002=0
CLMISSEDCNTR002=1
CLBCALBL002=
CLNAME003=
CLNUMBER003=xxxxxxxxxxxx
CLTYPE003=3
CLDATE003=11/09/16
CLTIME003=16:57
CLDURATION003= 1:54
CLBRIDGEDFLAG003=0
CLMISSEDCNTR003=0
CLBCALBL003=
etc......
This is a backup text file of an AVAYA 96xx phone. What you see above is three calls from the phone's call history. There's more in the text file than just calls, though, so to get just the call info I grab all the lines that start with "CL".
Here's a blurb of my code:
while ((line = file.ReadLine()) != null)
{
    if (line.Substring(0, 2) == "CL")
    {
        try
        {
            string[] strArray = line.Split("=".ToCharArray());
            string key = strArray[0];
            string str = strArray[1];
One call is made up of nine elements:
CLNAME, CLNUMBER, CLTYPE, CLDATE, CLTIME, CLDURATION, CLBRIDGEDFLAG, CLMISSEDCNTR, and CLBCALBL together make up one call. My question to you is: how would you go about parsing this out and inserting it into a database? Am I going about it the right way?
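For reference, one alternative way to sketch that grouping (a hedged illustration, not necessarily better than the counter-to-nine approach; the class and method names are invented): key each call by its three-digit suffix, so the nine fields can arrive in any order.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Sketch: group "CLfieldNNN=value" lines into one record per NNN suffix.
public static class CallParser
{
    static readonly string[] Fields =
    {
        "CLNAME", "CLNUMBER", "CLTYPE", "CLDATE", "CLTIME",
        "CLDURATION", "CLBRIDGEDFLAG", "CLMISSEDCNTR", "CLBCALBL"
    };

    public static List<Dictionary<string, string>> Parse(IEnumerable<string> lines)
    {
        // SortedDictionary keeps calls in suffix order ("001", "002", ...).
        var calls = new SortedDictionary<string, Dictionary<string, string>>();
        foreach (string line in lines)
        {
            int eq = line.IndexOf('=');
            if (!line.StartsWith("CL") || eq < 0) continue;
            string key = line.Substring(0, eq);            // e.g. CLTYPE003
            string value = line.Substring(eq + 1).Trim();
            string field = Fields.FirstOrDefault(f => key.StartsWith(f));
            if (field == null || key.Length != field.Length + 3) continue;
            string suffix = key.Substring(field.Length);   // e.g. "003"
            if (!calls.TryGetValue(suffix, out var call))
                calls[suffix] = call = new Dictionary<string, string>();
            call[field] = value;
        }
        return calls.Values.ToList();
    }
}
```

This trades a little allocation for robustness: a missing or out-of-order line no longer shifts every later value into the wrong column, which the nine-counter version is vulnerable to.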
Below is my complete code:
public static void letsdoit(SqlConnection con)
{
    string[] files;
    string line;
    int counter = 0;
    using (UNCAccessWithCredentials unc = new UNCAccessWithCredentials())
    {
        if (unc.NetUseWithCredentials(@"\\ql1telutil1\c$\inetpub\wwwroot", "xxxxxxxxxx", "xx", "xxxxxx"))
        {
            files = Directory.GetFiles(@"\\ql1telutil1\c$\inetpub\wwwroot\backup96XX");
            foreach (string f in files)
            {
                sqlString = null;
                int myCounter = 0;
                List<Int32> myCountList = new List<Int32>();
                List<Int32> UniqueCallNumber = new List<Int32>();
                System.IO.StreamReader file = new System.IO.StreamReader(f);
                string[] array2 = f.Split("\\".ToCharArray());
                string myStat = array2[7].ToString().Substring(0, 5);
                log.Info("Getting history for extension: " + myStat);
                while ((line = file.ReadLine()) != null)
                {
                    if (line.Substring(0, 2) == "CL")
                    {
                        try
                        {
                            string[] strArray = line.Split("=".ToCharArray());
                            string key = strArray[0];
                            string str = strArray[1];
                            sqlString = sqlString + MinifyB(str.Trim()) + "','";
                            myCounter = myCounter + 1;
                            if (myCounter == 9)
                            {
                                try
                                {
                                    addSQL =
                                        "INSERT INTO tblStationCallHistory(CLSTATION, CLNAME, CLNUMBER, CLTYPE, CLDATE, CLTIME, CLDURATION, CLBRIDGEDFLAG, CLMISSEDCNTR, CLBCALBL) " +
                                        "VALUES('" + myStat + "','" + sqlString.Substring(0, sqlString.Length - 2) + ")";
                                    SqlCommand updateCMD = new SqlCommand(addSQL, con);
                                    try
                                    {
                                        updateCMD.ExecuteNonQuery();
                                    }
                                    catch (Exception ex)
                                    {
                                        log.Error("There was a problem executing the command. " + ex.Message);
                                    }
                                }
                                catch (Exception ex)
                                {
                                    log.Error("There was a problem inserting the coverage path into the table. " + ex.Message);
                                }
                                sqlString = null;
                                myCounter = 0;
                            }
                        }
                        catch (Exception ex)
                        {
                            string myError = ex.Message.ToString();
                        }
                    }
                    counter++;
                }
                file.Close();
            }
        }
    }
}
Any hints, guidance etc to help me speed this up would be greatly appreciated! Thanks for any help!!
Dave
|
Just a quick hint: consider using bulk inserts through parametrised SQL queries, as they speed up the inserts considerably; for instance, a bulk insert in SQLite can insert 100,000 records in 3-5 seconds. I am referring to SQLite because that is what I tested with, but AFAIK the same applies to inserts on every database...
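A minimal sketch of that idea against SQL Server (the table and column names are taken from the thread; the method signature and the 9-element string[] per call are assumptions): one prepared, parametrised INSERT reused for every row, all inside a single transaction.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Hypothetical sketch: prepare one parametrised INSERT, reuse it per row,
// commit once. This avoids per-row SQL parsing and per-row log flushes,
// and also removes the string-concatenation (injection) problem.
static void InsertCalls(SqlConnection con, IEnumerable<string[]> calls, string station)
{
    using (SqlTransaction tx = con.BeginTransaction())
    using (SqlCommand cmd = con.CreateCommand())
    {
        cmd.Transaction = tx;
        cmd.CommandText =
            "INSERT INTO tblStationCallHistory(CLSTATION, CLNAME, CLNUMBER, CLTYPE, " +
            "CLDATE, CLTIME, CLDURATION, CLBRIDGEDFLAG, CLMISSEDCNTR, CLBCALBL) " +
            "VALUES(@station, @p0, @p1, @p2, @p3, @p4, @p5, @p6, @p7, @p8)";
        cmd.Parameters.Add("@station", SqlDbType.VarChar, 10);
        for (int i = 0; i < 9; i++)
            cmd.Parameters.Add("@p" + i, SqlDbType.VarChar, 50); // sizes assumed
        cmd.Prepare();

        foreach (string[] call in calls)   // each call = the nine CL* values
        {
            cmd.Parameters["@station"].Value = station;
            for (int i = 0; i < 9; i++)
                cmd.Parameters["@p" + i].Value = call[i];
            cmd.ExecuteNonQuery();
        }
        tx.Commit();   // one commit for the whole batch instead of one per row
    }
}
```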
Good luck
|
So, how many "exceptions" are you generating? Each exception will add some overhead; you should be attempting to improve the parsing so it generates fewer exceptions.
(For those that do not believe exceptions add overhead: I DON'T CARE WHAT YOU THINK; SAVE IT FOR SOMEONE WHO DOES).
Starting out, you should stick to timing the parsing; at this point, you don't know whether the problem is the parsing, the database inserts, or something else.
|
Hey Gerry... thanks for the reply! No exceptions yet. I stopped the code after about 1,600 text files, and it didn't generate any errors. 1,600 text files was around 130,000 records. I commented out the insert statement, and I was flying through all the text files: 24 per second. So I'm guessing that means my insert command is the problem. I believe they prohibit bulk inserts here; I'll have to check again and see if I'm able to do that. I'm also thinking of maybe inserting multiple records with one insert statement. If I could get it to about 10 per second, I'd be happy with that, I think.
Thanks again!
Dave
|
davers wrote: I believe they prohibit bulk inserts here
Does that mean you can't prepare a DataTable and send the lot via a DataAdapter? That doesn't use the SqlBulkOperations class, but it does mean that the insert becomes one message in each direction, instead of message-response-message-response-..., which is significantly slower because each round trip means a new command parse and construct at the SQL end.
Performing Batch Operations Using DataAdapters[^]
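A minimal sketch of that batched-DataAdapter approach (only two of the ten columns shown for brevity; the method name and batch size are assumptions): fill a DataTable locally, then push all rows in large batches.

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical sketch: a SqlDataAdapter configured for batched inserts.
// The DataTable's rows (all in the Added state) are sent UpdateBatchSize
// at a time, instead of one round trip per row.
static void BatchInsert(SqlConnection con, DataTable rows)
{
    using (SqlDataAdapter da = new SqlDataAdapter())
    {
        da.InsertCommand = new SqlCommand(
            "INSERT INTO tblStationCallHistory(CLSTATION, CLNAME) " +
            "VALUES(@station, @name)", con);
        da.InsertCommand.Parameters.Add("@station", SqlDbType.VarChar, 10, "CLSTATION");
        da.InsertCommand.Parameters.Add("@name", SqlDbType.VarChar, 50, "CLNAME");
        // Required for batching: don't fetch anything back per row.
        da.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
        da.UpdateBatchSize = 500;   // rows per round trip; tune to taste
        da.Update(rows);            // sends the whole table in ~rows/500 messages
    }
}
```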
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
Glad you've narrowed it down, Dave.
As others have suggested, if you can do a BULK INSERT from a file, then parse all your data first into a single (csv) file and pass that to BULK INSERT. Very fast.
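If server-side BULK INSERT from a file turns out to be locked down, the client-side equivalent may be worth a try: SqlBulkCopy streams the already-parsed rows over the same fast path, with no intermediate CSV. A sketch (it assumes the DataTable's columns match the destination table's):

```csharp
using System.Data;
using System.Data.SqlClient;

// Hypothetical sketch: bulk-load parsed rows straight from memory.
static void BulkLoad(SqlConnection con, DataTable parsedCalls)
{
    using (SqlBulkCopy bulk = new SqlBulkCopy(con))
    {
        bulk.DestinationTableName = "tblStationCallHistory";
        bulk.BatchSize = 5000;            // rows per batch; tune to taste
        bulk.WriteToServer(parsedCalls);  // one streaming operation
    }
}
```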
|
Indeed, parsing everything first and loading it in bulk is usually faster than inserting each record separately. I did a quick search just now and this [^] looks like a possible approach using parametrised SQL; it is also possible without an explicit "BULK" operation: a parametrised insert query, wrapped in a SQL transaction, would do...
Cheers,
|