|
I have my system date/time format set as "dd-MMM-yy", which outputs a DateTime as {16-Jun-08 8:30:00 AM}.
I want to make sure that, no matter what date/time format is set on the system/server, the DateTime is always shown as MM/dd/yyyy.
I googled a lot about setting the culture, but nothing points towards a solution; everything offers a solution that converts to a string and then changes the format. I don't want to convert to a string; I want to change the DateTime variable's output to a specific format.
Any help is appreciated.
|
|
|
|
|
nitin_ion wrote: datetime always shows
Where?
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
|
|
|
|
|
Whenever I try to get a value from a DateTime, it shows in the system-specific format.
This is what I want to change.
|
|
|
|
|
Still not clear, but since you posted it in the C# forum...
Custom Date and Time Format Strings[^]
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
|
|
|
|
|
My system date shows 14-Oct-14, so when I use a DateTime variable it also gives the date in the same format.
I want to change it to another format ('en-US').
I can do that with DateTime.Now.ToString("MM/dd/yyyy"), but then it is a string.
I want the DateTime itself to return in the specific format, not as a string.
This is because my SQL date format is different, and when I want to compare dates in SQL I get issues.
|
|
|
|
|
You missed it totally: DateTime is a binary format, and as such it has no human-readable presentation. When you check its value in the debugger, the environment uses the default (system) settings to present you with something readable...
If you have a problem with SQL date comparison, you had better show us your code so we can see the exact problem...
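To illustrate the point (a minimal sketch; the value is made up): the same DateTime can be rendered with any format string, and the underlying binary value never changes.

```csharp
using System;
using System.Globalization;

class FormatDemo
{
    static void Main()
    {
        // One underlying value: a tick count, with no format attached to it.
        var dt = new DateTime(2008, 6, 16, 8, 30, 0);

        // The "format" only exists in the strings you ask for:
        Console.WriteLine(dt.ToString("MM/dd/yyyy", CultureInfo.InvariantCulture)); // 06/16/2008
        Console.WriteLine(dt.ToString("dd-MMM-yy", CultureInfo.InvariantCulture));  // 16-Jun-08
    }
}
```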
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
|
|
|
|
|
nitin_ion wrote: I want the DateTime it self to return in the specific format and not as string
There is no such thing!
What you see on screen is ALWAYS a string representation of the data in a DateTime structure. There is NO FORMAT in a DateTime value. It is just a bunch of numbers that represents a date/time. The only way you get to see the value is if it is converted to some string format. That format is anything you choose.
Now, if you're looking in Visual Studio, it uses the system's culture information as a default format.
Now, if you stored your dates in your database as strings, you SERIOUSLY screwed yourself. DateTime should ALWAYS be stored in the database as a DateTime type, NEVER as a string.
Why? Because as a string representation, any DateTime comparisons or lookups will be done using character-string rules, NOT DateTime rules. The two sets of rules work very differently from each other.
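A quick illustration of how string rules go wrong (a sketch; the dates are made up):

```csharp
using System;
using System.Globalization;

class CompareDemo
{
    static void Main()
    {
        // As strings, "10/14/2013" sorts AFTER "02/01/2014" because '1' > '0'
        // in the first character - the year is never even considered:
        string a = "10/14/2013", b = "02/01/2014";
        Console.WriteLine(string.CompareOrdinal(a, b) > 0); // True - wrong order for dates

        // As DateTime values, the comparison uses date rules and is correct:
        var da = DateTime.ParseExact(a, "MM/dd/yyyy", CultureInfo.InvariantCulture);
        var db = DateTime.ParseExact(b, "MM/dd/yyyy", CultureInfo.InvariantCulture);
        Console.WriteLine(da < db); // True - 2013 really is earlier than 2014
    }
}
```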
|
|
|
|
|
There is no variable that sets the "output format"; there is a culture setting in Windows, and that is used to format dates if you do not specify your own formatting. If you want to change the setting application-wide, then you'd best set the correct culture for your application, effectively overriding the system-wide Windows setting.
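Something along these lines (a sketch; CultureInfo.DefaultThreadCurrentCulture requires .NET 4.5 or later):

```csharp
using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        // Override the Windows regional settings for this application only.
        // Any ToString()/Parse call that doesn't pass an explicit culture
        // will now use en-US conventions.
        var enUs = CultureInfo.GetCultureInfo("en-US");
        CultureInfo.DefaultThreadCurrentCulture = enUs;
        CultureInfo.DefaultThreadCurrentUICulture = enUs;

        var dt = new DateTime(2014, 10, 14);
        Console.WriteLine(dt.ToShortDateString()); // 10/14/2014
    }
}
```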
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
What people are telling you here is that what you "see" as datetime (in the debugger, in an application, in the database's resultset, ...) is just a representation of the datatype "datetime". Just as 10 is a representation of an integer and "10" is the string representation of the integer 10. Both the string and the integer value have a different "machine value" which is a bunch of 1's and 0's. Same goes for your datetime object. What you see is a representation of the object value which you can change as you wish.
Hope this clarifies things for you.
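You can even look at that "machine value" directly: a DateTime is really just a tick count (a minimal sketch):

```csharp
using System;

class TicksDemo
{
    static void Main()
    {
        var dt = new DateTime(2014, 10, 14, 8, 30, 0);

        long raw = dt.Ticks;          // the actual stored value: 100 ns ticks since year 1
        var same = new DateTime(raw); // rebuild from the raw number: identical value

        Console.WriteLine(raw);        // just a big integer - no format in sight
        Console.WriteLine(same == dt); // True
    }
}
```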
|
|
|
|
|
Hello,
I started using the VS 2013 profiler for the first time today.
Excellent tool at first sight.
But I ran into a first issue while profiling an XNA game.
I'm trying to figure out which method is causing the game to lag.
The profiler's "functions tree" feature is apparently only doing a partial job.
It only displays a tree with 6 levels of called functions and claims that the 6th-level function is at the bottom of the stack.
This is definitely not the case; that function does call some more methods.
Why does the profiler not show all of the called functions after the 6th level?
I looked for a parameter in the performance explorer properties. The only parameter that could be relevant was "display small functions". Enabling it did not help.
Any ideas?
Thanks in advance.
|
|
|
|
|
How is it possible to find the source code of the basic classes and methods in order to
override some methods, for instance to override ListBox methods (Items.Add)
to make some items a different colour?
The same as in Delphi, where you can Ctrl+click and jump to the source code of the Pascal unit.
|
|
|
|
|
Don't post the same question in two places - all you will do is duplicate work and annoy people. Pick either the C# forum or the Q&A and stick with it.
Annoyed people are less likely to be helpful than happy ones...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
|
That's the same link I sent him in QA!
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
Great minds and all that.
|
|
|
|
|
You could also use a decompiler (like ILSpy) to browse the code of the framework.
The biggest difference with Delphi is that you can't change the code and recompile your VCL.
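On the colouring itself: in WinForms you normally don't override Items.Add for that; you switch the control to owner-draw and handle DrawItem. A minimal sketch (the control setup and the "important" colour rule are made up for illustration):

```csharp
using System;
using System.Drawing;
using System.Windows.Forms;

class ColoredListBoxForm : Form
{
    public ColoredListBoxForm()
    {
        var listBox = new ListBox { Dock = DockStyle.Fill, DrawMode = DrawMode.OwnerDrawFixed };
        listBox.Items.AddRange(new object[] { "normal", "important", "normal" });

        listBox.DrawItem += (s, e) =>
        {
            if (e.Index < 0) return;
            e.DrawBackground();
            string text = listBox.Items[e.Index].ToString();
            // Hypothetical rule: draw "important" items in red.
            var color = text == "important" ? Color.Red : e.ForeColor;
            using (var brush = new SolidBrush(color))
                e.Graphics.DrawString(text, e.Font, brush, e.Bounds);
            e.DrawFocusRectangle();
        };

        Controls.Add(listBox);
    }
}
```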
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
Hi. I'm using RAPI2, trying to retrieve a value from a connected Pocket PC. I can retrieve REG_SZ values, but not REG_MULTI_SZ values.
Any clues?
using (RemoteDeviceManager mgr = new RemoteDeviceManager())
{
    using (RemoteDevice dev = mgr.Devices.FirstConnectedDevice)
    {
        using (RemoteDevice.DeviceRegistryKey hKey = dev.DeviceRegistryLocalMachine.OpenSubKey(@"Comm\NPME\JEDI10_1\Parms\TcpIp"))
        {
            txtIPAddress.Text = hKey.GetValue("IPAddress", string.Empty).ToString();
        }
    }
}
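For anyone hitting this later: a common gotcha (an assumption here, since it depends on how the RAPI2 wrapper surfaces registry values) is that a REG_MULTI_SZ value comes back as a string[] rather than a single string, so calling ToString() on it just yields the array's type name. A sketch of handling both cases:

```csharp
using System;

class MultiSzDemo
{
    static string RegistryValueToText(object value)
    {
        // REG_MULTI_SZ values are typically surfaced as string[];
        // plain REG_SZ values come back as a single string.
        if (value is string[] parts)
            return string.Join(Environment.NewLine, parts);
        return value?.ToString() ?? string.Empty;
    }

    static void Main()
    {
        // Stand-in for hKey.GetValue("IPAddress") on a REG_MULTI_SZ key:
        object multi = new[] { "10.0.0.1", "10.0.0.2" };
        Console.WriteLine(RegistryValueToText(multi));
    }
}
```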
|
|
|
|
|
|
Thank you Richard. Problem solved.
|
|
|
|
|
I've ported a game project from XNA using DirectX to MonoGame using OpenGL, and everything works fine except for the in-game graphics.
Many of the models appear to be white with no texture. I suspect, however, that the textures are there but that everything is simply overexposed.
I am not familiar with OpenGL or MonoGame. Any help will be greatly appreciated.
|
|
|
|
|
paulrm wrote: Any help will be greatly appreciated.
You need to be much clearer about what help you are asking for; no one can guess what your code is trying to do. Also, is this connected to C#?
|
|
|
|
|
I don't know how I could be any clearer. I have ported the game project from XNA using DirectX to MonoGame using OpenGL. The game project and the content project both compile, and the executable launches the game, which appears to run quite well.
The problem is that many of the models are rendered bright white (bleached out). This does not happen when the game is compiled with XNA using DirectX. Sorry, I don't see any way to attach a screenshot.
I realize this is actually a question about MonoGame/OpenGL or XNA/DirectX, but I don't see a forum for either MonoGame or XNA. This seemed the appropriate location since the game is written in C#.
Perhaps Graphics would have been more suited to the question. Feel free to move the topic if you like.
|
|
|
|
|
What does your Game class look like? Specifically, what is GraphicsDevice.SamplerStates[0] set to? Is it null? If not, what is it set to?
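While you check that: white, untextured models after a DirectX-to-OpenGL port often come down to the effect not sampling the texture at all. A sketch of the kind of thing to look at in your Draw method (BasicEffect assumed; the property names are standard XNA/MonoGame, but treat this as a checklist, not a definitive fix):

```csharp
// Inside Draw, before rendering each mesh:
foreach (ModelMesh mesh in model.Meshes)
{
    foreach (BasicEffect effect in mesh.Effects)
    {
        effect.TextureEnabled = true;   // without this, only the diffuse colour is used
        effect.EnableDefaultLighting(); // pure white can also be unlit/overlit geometry
    }
    mesh.Draw();
}

// And make sure a sampler state is actually set; a null sampler
// can render white under the OpenGL backend:
GraphicsDevice.SamplerStates[0] = SamplerState.LinearWrap;
```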
|
|
|
|
|
You said "many" models appear "... white ...". "Many" does not mean "all" models. I can only conclude that "some" models do render properly; in which case, I would see what is different about those models / textures. That at least gives you a start in trying to solve the problem.
|
|
|
|
|
Some models do render with color, but from what I see, it is the material color included within the model itself.
I don't think the textures are rendering. As I said in a previous reply, the textures are compiling.
|
|
|
|