I was checking out some posts on sandboxing and noticed the mention of NSTask as an option for launching apps.
One question related to NSTask that I've been putting off for a while is handling piped shared buffers. Have you implemented pipes like these for communication (in my case, for receiving video from the shell binary)?
I want to integrate mplayer2, which supports piped shared buffering to Cocoa apps, but while I can launch the player I can't work out how the pipes are supposed to communicate, or how to display the incoming video afterwards.
This is currently done like this in Cocoa (Objective-C):
http://jongampark.wordpress.com/2010/03 ... e-program/ (the post is mainly about controlling the unix binary, which is useful as well, but it also explains how the piping is handled)
(pre-built mplayer2 standalone for this can be found here, for testing:
http://code.google.com/p/mplayerosx-bui ... p&can=2&q=)
Normally mplayer2 can be launched with the parameter -vo corevideo for normal playback, or -vo shared_buffer:buffer_name=mybuff for playback through a shared buffer suitable for piping.
http://www.mplayer2.org/docs/mplayer/
(mybuff: the name of the shared buffer created with shm_open(), as well as the name of the NSConnection mplayer2 will try to open, from the docs)
Back in 2005 Ken Mankoff (who's gone off to live a life of mapping glaciers with Kinect hardware, incidentally) came up with some libraries that managed to use shm_open() and its ilk to read shared memory spaces like what's needed here (you can see the comment here:
https://monkeybreadsoftware.eu/listarch ... 0-19.shtml )
He published a sample project and library, but that particular website was lost in 2007.