Date:      Wed, 21 Feb 1996 16:26:10 -0600
From:      Jim Lowe <james@miller.cs.uwm.edu>
To:        mmead@Glock.COM
Cc:        hackers@freebsd.org, multimedia@star-gate.com
Subject:   Re: frameserv and clients
Message-ID:  <199602212226.QAA07493@miller.cs.uwm.edu>

> 	Hmm.  I have vic on my system, but have not been able to get its
> quickcam patches to actually take pictures from my quickcam.  Unfortunately,
> Virginia Tech won't pipe MBONE out to the BEV Ethernet routers, so I can't get
> MBONE without someone tunnelling, and the closest person is, you guessed it,
> Virginia Tech.

I would probably try to debug the qcam stuff to make it work.
You can run all the MBONE tools (vic, vat, etc.) unicast as well as
multicast, and you can also run them on your local network as multicast.
You will have the same restrictions whether you use vic or write your
own application.

> 
> 	Hmm.  Ok, so when vic runs it is actually two parts?  It's a grabber
> program that knows how to snag frames, and a client program which knows how to
> get them from the grabber program?

Yes, you can look at it that way.  Vic grabs frames from video capture
cards on various platforms.  It has various capabilities, including hooks
for hardware MPEG, JPEG, etc. encoding.  It can transmit these packets
onto the network with unicast or multicast, and it is also a receiver of
this information.  Vic knows how to decode RTP packets and display them
in an X11 window; I guess you could call it a full-duplex video
application.  The nice thing about vic is that it already uses IP and RTP,
so you don't have to reinvent the API.  You can use RTP as your video API...
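
Just to make the "RTP as your API" point concrete, here is a rough
sketch of the fixed RTP header (this is not vic's code, just the wire
layout from RFC 1889; the payload type value depends on the encoding,
e.g. 31 for H.261):

#include <stddef.h>
#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>

struct rtp_hdr {
        uint8_t  vpxcc;  /* version(2) | padding(1) | extension(1) | CSRC count(4) */
        uint8_t  mpt;    /* marker(1) | payload type(7) */
        uint16_t seq;    /* sequence number, network byte order */
        uint32_t ts;     /* media timestamp, network byte order */
        uint32_t ssrc;   /* synchronization source id */
};

/* Prepend an RTP header to one encoded frame fragment before sending. */
static size_t
rtp_wrap(uint8_t *pkt, const uint8_t *frag, size_t len,
    uint16_t seq, uint32_t ts, uint32_t ssrc, int marker, int pt)
{
        struct rtp_hdr h;

        h.vpxcc = 2 << 6;       /* RTP version 2, no padding/extension/CSRC */
        h.mpt   = (uint8_t)((marker ? 0x80 : 0) | (pt & 0x7f));
        h.seq   = htons(seq);
        h.ts    = htonl(ts);
        h.ssrc  = htonl(ssrc);
        memcpy(pkt, &h, sizeof h);
        memcpy(pkt + sizeof h, frag, len);
        return sizeof h + len;
}

Hand the resulting packet to sendto() on a UDP socket (unicast or
multicast destination, it makes no difference to the sender) and any
receiver that speaks RTP can parse it; whether it can decode the payload
depends on which encoding you picked.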


> 	Let me explain the main reasoning behind my development of frameserv
> and friends.  It's pretty simple actually - I want to have fast access to
> frames off the quickcam (or other capture devices), while also allowing
> *multiple* programs to get those frames at the same time.  Does vic's setup
> allow this or does it require exclusive use of the device you're grabbing video
> from?
> 
Vic (as well as your frame grabber) will require exclusive access to
the device to grab the frames.  What you want is to put the frames on
the network so that many things can read the same frames, and vic does
this.  You can use multicast or unicast to do it.  I suppose you could
also use the local loopback device.  Another method would be to use
shared memory -- but then you would need a shared-memory network
extension for your machine (I think I saw one of these somewhere -- mnfs?).
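
For what it is worth, the receiving side of the multicast approach is
only a little socket code.  Roughly something like this (the group
address and port below are made up for the example, and error handling
is trimmed); every process that does this gets its own copy of each
frame packet:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

int
main(void)
{
        int s = socket(AF_INET, SOCK_DGRAM, 0);
        int one = 1;
        struct sockaddr_in sin;
        struct ip_mreq mr;
        char buf[8192];
        ssize_t n;

        /* Let several local readers bind the same port
           (some systems want SO_REUSEPORT here instead). */
        setsockopt(s, SOL_SOCKET, SO_REUSEADDR, &one, sizeof one);

        memset(&sin, 0, sizeof sin);
        sin.sin_family = AF_INET;
        sin.sin_addr.s_addr = htonl(INADDR_ANY);
        sin.sin_port = htons(5004);                       /* example port */
        if (bind(s, (struct sockaddr *)&sin, sizeof sin) < 0)
                exit(1);

        mr.imr_multiaddr.s_addr = inet_addr("224.2.0.1"); /* example group */
        mr.imr_interface.s_addr = htonl(INADDR_ANY);
        if (setsockopt(s, IPPROTO_IP, IP_ADD_MEMBERSHIP, &mr, sizeof mr) < 0)
                exit(1);

        while ((n = recv(s, buf, sizeof buf, 0)) > 0)
                printf("got a %ld byte frame packet\n", (long)n);
        return 0;
}

The sender just sendto()s each packet to the same group address; it
never has to know how many readers there are.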

The major problem with grabbing frames is the amount of bandwidth
they consume.  If you have a quickcam (a greyscale device) with a
small frame size (160x120), it doesn't consume much bandwidth.

	160x120, greyscale, 1 frame/second uses 19.2 kbytes/second.
	320x240, greyscale, 1 frame/second uses 76.8 kbytes/second.
	640x480, greyscale, 1 frame/second uses 307.2 kbytes/second.
	
	Multiply by 30 for real-time video (30fps), then by either
	2 for yuv 4:2:2 encoding or 4 for true color (possibly 3).
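
If you want to play with the arithmetic yourself, a few lines of C
reproduce the figures above; bytes-per-pixel is 1 for greyscale, 2 for
yuv 4:2:2, and 3 or 4 for true color, and I am counting 1000 bytes per
kbyte:

#include <stdio.h>

int
main(void)
{
        int sizes[][2] = { { 160, 120 }, { 320, 240 }, { 640, 480 } };
        int rates[] = { 1, 30 };
        int i, j, depth;

        for (i = 0; i < 3; i++)
                for (j = 0; j < 2; j++)
                        for (depth = 1; depth <= 4; depth++)
                                printf("%dx%d, %d byte(s)/pixel, %2d fps: %8.1f kbytes/sec\n",
                                    sizes[i][0], sizes[i][1], depth, rates[j],
                                    sizes[i][0] * sizes[i][1] * depth * rates[j] / 1000.0);
        return 0;
}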

You will note that these numbers get very big very fast.  One will
need some sort of compression algorithm to deal with this.  Vic
already has H.261 and nv encoding and has been designed to deal with
hardware compression.  You can easily add whatever encoding algorithm
you wish to vic, and it outputs something we all know about, namely
RTP.

My only point, and feel free to ignore me, is that a network frame
grabber is already available.  It has all the tools one needs to do
everything you described, and it does much more.  By developing RTP
tools to work with it, you don't need to reinvent the wheel, and the
tools you write may turn out to have uses beyond the ones originally
intended.

	-Jim


