I'm using the VMR9 with multiple graphs. This is being presented by a modified version of the MultiVMR9 allocator/presenter.
I basically create an additional swap chain to present the additional window on the second monitor.
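Roughly, the swap chain part looks like this (names here are illustrative, not the actual MultiVMR9 code):

    // Sketch: create a second swap chain targeting the window I created
    // on the second monitor. pDevice and hwndSecondMonitor are assumed.
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_COPY;
    pp.hDeviceWindow    = hwndSecondMonitor;  // window on monitor 2
    pp.BackBufferFormat = D3DFMT_UNKNOWN;     // use the current mode's format

    IDirect3DSwapChain9* pExtraChain = NULL;
    HRESULT hr = pDevice->CreateAdditionalSwapChain(&pp, &pExtraChain);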
|
Are you using the IVMRMonitorConfig9 interface on the renderer to set the playback monitor?
I ask because it sounds like it's DirectX related. I have no problem on multiple monitors with
a custom renderer which uses GDI, and it uses close to 0 CPU.
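Something like this, I mean (a sketch from memory, so treat the details as assumptions):

    // Sketch: ask the VMR9 filter for IVMRMonitorConfig9 and set the
    // playback monitor. pVMR9Filter is assumed to be the renderer filter.
    IVMRMonitorConfig9* pMonCfg = NULL;
    HRESULT hr = pVMR9Filter->QueryInterface(IID_IVMRMonitorConfig9,
                                             (void**)&pMonCfg);
    if (SUCCEEDED(hr))
    {
        pMonCfg->SetMonitor(1);   // adapter ordinal for the second monitor
        pMonCfg->Release();
    }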
Mark
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
Mark,
I've had no luck in getting SetMonitor to work, which is why I went to the MultiVMR9 sample.
The MS folks on the DirectShow forum told me to use an additional swap chain or device. I managed to get both ways working, but each still has the same problem with CPU usage when the window is created from my app on the secondary monitor and then passed as an HWND to the MultiVMR9 allocator.
The MultiVMR9 is a complicated beast to work with, but so far, has proven the only way I can get this thing presenting to more than one monitor.
Here's what it kinda looks like:
Player 1 goes to a VMR9 renderer (has own small window on primary).
+ to the allocator
Player 2 goes to a VMR9 renderer (has own small window on primary).
+ to the allocator
Each of these is a separate renderer in the subgraphs.
Both original VMRs go to the MultiVMR9 DLL A/P, which renders onto a preview window on the primary display.
That is StretchRect'ed onto the additional swapchain in the allocator, which is the window I created on the secondary monitor in my app.
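In code, the present step is roughly this (my names, simplified from the real allocator; pDevice, pExtraChain, and pPreviewSurface are assumed):

    // Sketch: copy the composed preview frame onto the extra swap chain's
    // back buffer, then present it on the secondary-monitor window.
    IDirect3DSurface9* pDst = NULL;
    pExtraChain->GetBackBuffer(0, D3DBACKBUFFER_TYPE_MONO, &pDst);
    pDevice->StretchRect(pPreviewSurface, NULL, pDst, NULL, D3DTEXF_NONE);
    pExtraChain->Present(NULL, NULL, NULL, NULL, 0);
    pDst->Release();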
I require the separate graphs, as I need control over each of the input streams/files, for seeking, alpha, etc.
I've tried looking at this with Spy++, but see no messages coming through.
Totally befuddled, since I can look at other apps that do it...
|
Hmm, your setup makes total sense, but the CPU usage doesn't (but you knew that already).
If you set up your additional swap chain to go to a window on the primary display, does the CPU
usage stay down?
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
If I create the window on the primary monitor, CPU usage hovers around 15% while rendering the video, about what I would expect. When I create it on the second monitor, it hovers around 95%.
What's even weirder is that it's using high CPU (70+%) even when not rendering or having a file loaded. At that stage, the window has already been created and its handle passed to the A/P.
I'm not sure how Windows handles the secondary monitor internally, but obviously something is going on behind the scenes, and I don't see any messages being passed...
|
Yeah, that's what makes me suspect a DirectX (actually, more specifically, a Direct3D) issue.
Sounds like it's in the renderer stage, not the DirectShow part.
I'm no multi-monitor video adapter expert, but I remember reading something in the past about
some adapters that support two monitors: they use the onboard video RAM for the primary, but if
there's not enough video RAM for the secondary, they fall back to system RAM... I wish I could
remember the details.
Mark
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
I appear to have gotten somewhere...
By changing the adapter parameter from D3DADAPTER_DEFAULT to 1 in CreateDevice, it seems to work much better (although still a bit more CPU than on the primary).
I'm not sure I totally understand that, though, since CreateDevice is called for the preview window on the primary display, and the additional swap chain is what handles the secondary monitor...
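For reference, the changed call looks roughly like this (pD3D and hwndPreview are my names, not the real MultiVMR9 code):

    // Sketch: create the device on adapter 1, the one driving the
    // secondary monitor, instead of D3DADAPTER_DEFAULT (0).
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed      = TRUE;
    pp.SwapEffect    = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hwndPreview;          // preview window on the primary

    IDirect3DDevice9* pDevice = NULL;
    HRESULT hr = pD3D->CreateDevice(
        1,                                   // was D3DADAPTER_DEFAULT
        D3DDEVTYPE_HAL,
        hwndPreview,
        D3DCREATE_SOFTWARE_VERTEXPROCESSING,
        &pp,
        &pDevice);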
Since the MultiVMR9 DLL is so complex, I think I'm gonna have a stab at writing my own custom A/P and incorporate it into my app, as opposed to using their DLL.
If I get it working well, I'll post it here as an article, so hopefully nobody else will have to go through this tangled mess.
|
Again, I use custom renderers implemented with GDI, so the Direct3D stuff is vague to me.
Did you mention whether you're using one video adapter or two?
I guess it's obvious the CPU is doing a bunch of work that the DirectX-enabled video
processor normally handles...
Also, are you actually mixing two streams into one VMR, or just using multiple VMRs? (I'm unclear
on how you are using the MultiVMR9 DLL.)
For my software, I needed multiple viewers of individual media streams (filter graphs).
My solution was to create my own renderer filters (I use two: one for actual rendering, and one
that is more of a frame grabber, but implemented as a renderer filter so it completes a filter graph).
For each graph, I implemented it kind of like sockets (I call 'em video sockets).
The graph is built and run using the dumb renderer sitting on an infinite tee. I do that so the
graph can already be running when I attach viewers.
Each time I need a new viewer, I just connect an instance of my renderer to the infinite tee
(kinda like plugging into a socket). I added a callback to the renderer so I can get a look at
every frame before it's displayed and perhaps mix something with (or draw on) the frame.
The rendering is done with plain old GDI.
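As a rough sketch (my names; GetUnconnectedPin is an assumed helper that finds a free pin, not a DirectShow API):

    // Sketch of the "video socket" idea: the graph runs against an
    // infinite tee, and each new viewer is another renderer instance
    // connected to a fresh tee output pin.
    IBaseFilter* pTee = NULL;
    CoCreateInstance(CLSID_InfTee, NULL, CLSCTX_INPROC_SERVER,
                     IID_IBaseFilter, (void**)&pTee);
    pGraph->AddFilter(pTee, L"Infinite Tee");

    // ...source connected to the tee, dumb renderer on one output,
    // graph already running...

    // Attaching a viewer later:
    pGraph->AddFilter(pMyRenderer, L"Viewer");
    pGraph->Connect(GetUnconnectedPin(pTee, PINDIR_OUTPUT),
                    GetUnconnectedPin(pMyRenderer, PINDIR_INPUT));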
The performance is great. With only one graph per stream, the CPU usage is whatever it takes to
capture and move frames given the frame rate and frame size. Adding more views doesn't add
any noticeable CPU (I test this every once in a while - open 20 or 30 video windows on multiple
monitors LOL).
Good luck - I'll look forward to the article!
Mark
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
Hi there,
Consider a socket connection from server 1 (IP:port) to server 2 (a different IP:port, on a different outside IP). If the IP address of server 2 changes for some reason, what happens to the socket connection between the two servers? The application that uses the connection is running the whole time, but server 2 is located in a place where the IP changes every hour.
Regards,
|
Hi,
The connection is dropped, because the socket (address and port pair) is no longer valid. A TCP connection is identified by its two endpoints (source IP and port, destination IP and port); once either end's address changes, packets no longer match the established connection.
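A minimal Winsock sketch, if you want the application on server 1 to notice the drop instead of hanging on a dead peer:

    #include <winsock2.h>

    // Enable TCP keepalive on a connected socket so a vanished peer
    // (e.g. one whose IP changed) is eventually detected.
    void EnableKeepAlive(SOCKET sock)
    {
        BOOL on = TRUE;
        setsockopt(sock, SOL_SOCKET, SO_KEEPALIVE,
                   (const char*)&on, sizeof(on));
    }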
-----------
Mila
|
This is what I think too, but I have a colleague who says that once the connection between the two servers is up through a socket, it should stay up even if the IP changes for one server, and only new connections should use the new IP address, not the one that is already established.
|
I just tried changing the IP on a LAN and the TCP/IP connection dropped.
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
Hi,
I am getting a CDBException in my code, which accesses a SQL Server database from VC6.
CDatabase pData;
CString strConnectionStr = "Driver={SQL Server}; Server=local;Database=Northwind; UID=sa; PWD=sa;";
CString strSqlQuery = " update Employees set city='LondonUK' where city='london'";

try
{
    pData.Open(NULL, false, false, strConnectionStr);
}
catch (CDBException* dbExcep)
{
    dbExcep->ReportError();
    return 0;
}

try
{
    pData.ExecuteSQL(strSqlQuery);
}
catch (CDBException* dbExcep)
{
    AfxMessageBox(dbExcep->m_strError);
    switch (dbExcep->m_nRetCode)
    {
    case AFX_SQL_ERROR_API_CONFORMANCE:
        AfxMessageBox("AFX_SQL_ERROR_API_CONFORMANCE");
        break;
    case AFX_SQL_ERROR_CONNECT_FAIL:
        AfxMessageBox("Connection to the data source failed");
        break;
    case SQL_INVALID_HANDLE:
        AfxMessageBox("Sql invalid handle");
        break;
    default:
        AfxMessageBox("no result");
    }
    return 0;
}
In the switch, the result is SQL_INVALID_HANDLE.
Please tell me where the problem is.
Cyber Friend
|
What is the error message? I see you display one of the strings in a message box.
What's the message?
Try adding this to your catch block and see the error details...
TRACE(_T("** %s \n"), dbExcep->m_strError);
TRACE(_T("** %s \n"), dbExcep->m_strStateNativeOrigin);
Also don't forget to delete the exception object
dbExcep->Delete();
Mark
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
Hi,
Thanx, Mark, for your reply.
Mark, there is no error msg; both strings are empty. I checked them while debugging too. I can determine the error only from dbExcep->m_nRetCode in the switch statement, which hits this case:
case SQL_INVALID_HANDLE:
AfxMessageBox("Sql invalid handle");
break;
One important thing is that I receive a "Debug assertion failed" when I try to delete dbExcep on this line:
delete dbExcep;
Cyber Friend
|
Cyber Friend wrote: there is no error msg; both strings are empty
That's awfully helpful of the driver
Oh well, glad you found it!
Mark
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
Hi,
I have solved the problem by using CDatabase::OpenEx(strCon).
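For reference, roughly what the working call looks like (the noOdbcDialog option is my choice here, just to suppress the driver prompt):

    // Sketch: OpenEx takes the raw connect string directly, unlike
    // Open, which expects an "ODBC;"-prefixed string or a DSN.
    CDatabase db;
    CString strCon = "Driver={SQL Server};Server=local;Database=Northwind;UID=sa;PWD=sa;";
    db.OpenEx(strCon, CDatabase::noOdbcDialog);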
Thanx to all who participated in this thread.
best regards,
cyber friend
|
Cool
"Do you know what it's like to fall in the mud and get kicked... in the head... with an iron boot?
Of course you don't, no one does. It never happens. It's a dumb question... skip it."
|
Cyber Friend wrote: CString strSqlQuery=" update Employees set city='LondonUK' where city='london'";
Not sure though; try modifying this to:
CString strSqlQuery=" update Employees set city= \"LondonUK\" where city=\"london\"";
In fact, that doesn't seem to be the problem. But according to MSDN, this particular error comes from a programming error.
|
Hi,
Parasad, thanx for your reply.
I have tested the same query on an Access database with the Access connection string. I am sure the query is OK because I have tested it in Query Analyzer for SQL Server too.
Best regards,
Cyber Friend.
|
Hi, I want to change a virtual key code from ASCII 65 to ASCII 235 in my Windows hook DLL.
I am able to do it up to ASCII 127, i.e. 7 bits, but not the 8th bit. Can anyone help me with how to do this?
I am also looking for a solution for converting ASCII to Unicode.
thank you
viral
|
viral_umang@hotmail.com wrote: I want to change a virtual key code from ASCII 65 to ASCII 235 in my Windows hook DLL. I am able to do it up to ASCII 127, i.e. 7 bits, but not the 8th bit.
I don't know what you are trying to do, but it looks like you are using a char instead of an unsigned char. A char is signed, so it only goes up to 127, not 255.
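For example (a tiny sketch of the signedness issue):

    #include <stdio.h>

    int main()
    {
        char c = 235;            // signed char: wraps to -21 on VC++
        unsigned char u = 235;   // unsigned char: stores 235 as intended
        printf("%d %d\n", c, u); // prints: -21 235
        return 0;
    }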
|
Hi,
Thank you very much for replying so fast.
I am from India. I have created a keyboard hook for changing the keyboard's keys from, say, 'a' to 'b', so I get 'b' when I press the 'a' key on my keyboard. I do this by changing the virtual key code in my hook procedure.
What I want now is to get a font's symbol at 235, but I am not able to get the virtual key code for ASCII 235.
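For illustration, a sketch of one possible approach (SendUnicodeChar is a made-up helper, not code from my hook): since 235 is a character rather than a virtual key, the hook could swallow the original key and inject the character directly with SendInput and KEYEVENTF_UNICODE:

    #include <windows.h>

    // Inject a character (e.g. code point 235) as a Unicode key event,
    // so no virtual-key code for it is needed.
    void SendUnicodeChar(wchar_t ch)
    {
        INPUT in[2] = {0};
        in[0].type = INPUT_KEYBOARD;
        in[0].ki.wScan = ch;                  // the character, not a scan code
        in[0].ki.dwFlags = KEYEVENTF_UNICODE;
        in[1] = in[0];
        in[1].ki.dwFlags |= KEYEVENTF_KEYUP;  // matching key-up event
        SendInput(2, in, sizeof(INPUT));
    }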
thanks,
Viral
|
Well, you don't provide a lot of information, do you? Did you check what I said in my previous post? Can you post the code? It's impossible to tell what you did wrong just by guessing.