[Ffmpeg-devel-irc] ffmpeg.log.20131023

burek burek021 at gmail.com
Thu Oct 24 02:05:01 CEST 2013


[00:00] <Eduard_Munteanu> It's much easier to just sync playback to a shared, synced wall clock for displaying loops.
[00:02] <Eduard_Munteanu> (AFAIK there's no such project, everything seems to be using explicit sync network traffic)
[00:05] <burek> NTP-synchronized playback ?
[00:06] <Eduard_Munteanu> burek: you sync each player's clock via NTP, and have them agree on a time reference, e.g. Unix epoch. Then if you keep playback synchronized to the clock modulo loop length, you get synced playback without explicit network communication.
[00:07] <burek> hmh
[00:07] <burek> what is the real-world use case for that?
[00:07] <Eduard_Munteanu> IOW you assume the loop started at the Unix epoch and has repeated until today; then it's easy to compute what you should be playing now.
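The scheme Eduard_Munteanu describes reduces to a one-line computation. A minimal C sketch (the function and variable names are invented for illustration), assuming the wall clock is already NTP-disciplined:

    #include <time.h>

    /* Where in the loop we should be right now, assuming the loop conceptually
     * started at the Unix epoch and has repeated ever since.
     * loop_len_ms is the total length of the looped playlist in milliseconds. */
    static long long loop_position_ms(long long loop_len_ms)
    {
        struct timespec ts;
        clock_gettime(CLOCK_REALTIME, &ts);          /* NTP-synced wall clock */
        long long now_ms = (long long)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
        return now_ms % loop_len_ms;                 /* offset into the loop */
    }

Every player that evaluates this against a synced clock lands on the same offset, which is the whole point: no sync traffic is needed at playback time.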
[00:07] <Eduard_Munteanu> burek: displaying ads
[00:08] <burek> on a network stream?
[00:08] <Eduard_Munteanu> burek: no, each player has files stored locally. They are distributed over a network, but not streaming.
[00:09] <Eduard_Munteanu> Mainly because streaming is unreliable and traffic-hungry.
[00:09] <burek> couldn't you just make a cronjob / scheduled job for that and sync the system clocks prior to that?
[00:10] <Eduard_Munteanu> burek: no, they go out of sync rather quickly. It's quite annoying if you have a lot of TVs or other devices in the same room.
[00:10] <burek> sync them more often
[00:11] <burek> it's a less intensive operation to sync your clocks more often than to build synced playback the way you described it
[00:12] <Eduard_Munteanu> burek: depends how often. NTP itself isn't very high-traffic, and it can use a local time source; it doesn't even need to be accurate.
[00:13] <Eduard_Munteanu> I got down to 3ms lag on the same machine, but a couple tens of ms is very reasonable for a LAN, even wireless.
[00:14] <Eduard_Munteanu> Lag as measured vs wall clock. And this involves adjusting the playback speed as well for fine tuning and making it stable.
[00:14] <Kuukunen> Eduard_Munteanu: btw, there's ffmpegsource that's meant for easier api... dunno where it's going right now tho
[00:15] <Eduard_Munteanu> Kuukunen: hm, I'll have a look, I was implementing this in Haskell though. I still wonder if I can just pipe stuff to it :)
[00:15] <burek> wait, you are playing offline files, and the only thing that is not working correctly is that your ads don't display at the correct time, or even that the background video goes out of sync between displays?
[00:15] <Eduard_Munteanu> burek: yes, it has to be synced between players
[00:16] <burek> why dont you create a multicast stream then
[00:16] <Eduard_Munteanu> burek: because it's traffic-hungry and it doesn't really work on usual wireless hardware.
[00:16] <Kuukunen> Eduard_Munteanu: also, you could just run a video player with some sort of playback control API... I think VLC has one?
[00:16] <burek> err... no it's not
[00:17] <Eduard_Munteanu> Kuukunen: that's what I was doing with mplayer, controlling the playback speed through the slave socket.
[00:17] <burek> you have only 1 stream going over your network and when the stream reaches your multicast point (usually your LAN switch/router) it gets multiplied and sent to each player/client that requested the stream
[00:17] <Eduard_Munteanu> burek: I want it to work for 1080p, high quality H264. Besides, it's unreliable.
[00:18] <burek> ok
[00:18] <burek> so you want your TVs (or whatever) to only show ads? or is there other content too? is that other content also offline?
[00:18] <Eduard_Munteanu> burek: that might work on wired LAN but it doesn't really work on commodity wireless APs. Many of them just freeze or fallback to a *very* low data rate.
[00:18] <Eduard_Munteanu> burek: nah, only ads... all content stored locally.
[00:20] <Eduard_Munteanu> Besides I find streaming wasteful, there's more than enough storage. :)
[00:21] <burek> well in that case, you can try to sync your devices each minute and create cronjobs on each device to play your ads at a given schedule/program (even better, make your devices download the "program" from a server each morning).. that would make it all run at the same time, given that the hardware is the same (it takes the same amount of time to load/run ffmpeg)
[00:22] <Eduard_Munteanu> burek: every minute is way too infrequent; it becomes noticeable if they're out of sync for even 5s.
[00:22] <burek> then maybe your hardware is faulty?
[00:22] <burek> if your clocks drift that much
[00:22] <Eduard_Munteanu> Think about it, one or two frames ahead is acceptable, but not more.
[00:23] <Eduard_Munteanu> That's already like 80ms.
[00:23] <burek> btw, if you created your own player, why don't you just send a UDP broadcast packet as a signal to all the players to play the content at that specific moment
[00:23] <Eduard_Munteanu> burek: no, they just drift, especially hw where I don't choose drivers.
[00:23] <burek> to avoid constant syncing
[00:24] <Eduard_Munteanu> burek: because I'd essentially replicate NTP :)
[00:24] <Eduard_Munteanu> And it's unreliable and it's a lot of traffic, I found that out when testing mplayer's UDP sync.
[00:25] <burek> not quite.. this way you don't care about the clock syncing... you just listen on a UDP port and trigger the playback when the packet arrives, that's all
[00:25] <Eduard_Munteanu> burek: oh, just for control? Yeah, that'd work.
[00:25] <Eduard_Munteanu> I still need to sync though.
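The "listen on a UDP port and trigger playback" idea burek describes is only a few lines of socket code. A minimal, hypothetical C sketch of the receiver side (the port number and payload handling are invented for illustration):

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main(void)
    {
        int fd = socket(AF_INET, SOCK_DGRAM, 0);
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);   /* accept broadcast traffic */
        addr.sin_port = htons(9999);                /* arbitrary example port */
        bind(fd, (struct sockaddr *)&addr, sizeof(addr));

        char buf[64];
        for (;;) {
            ssize_t n = recvfrom(fd, buf, sizeof(buf) - 1, 0, NULL, NULL);
            if (n <= 0)
                continue;
            buf[n] = '\0';
            printf("trigger: %s\n", buf);           /* e.g. start the named ad here */
        }
        close(fd);
        return 0;
    }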
[00:25] <burek> but i hope you are not telling me that, when you press play on two devices at the same time, the content drifts out of sync before the end of the commercial?
[00:26] <Eduard_Munteanu> burek: they do
[00:26] <burek> how is that possible :)
[00:27] <Eduard_Munteanu> I figure many players don't care about drifting as long as the average playback speed is correct.
[00:27] <burek> well any decent player should respect the timestamp info in the stream and display frames accordingly
[00:27] <burek> if your system clock drifts that much over such a short period of time... man..
[00:27] <burek> you're screwed :)
[00:28] <Eduard_Munteanu> burek: AFAIK they do, but they use an *internal* clock source
[00:28] <Eduard_Munteanu> They don't care about the wall clock.
[00:28] <Eduard_Munteanu> e.g. if the video twitches, it resumes playback as soon as possible, it doesn't care about any reference clock.
[00:29] <burek> so, your ads are that short, and even in that case it is possible for 2 devices to go out of sync..?
[00:30] <burek> you could send a UDP sync packet at the start of each ad so the players end the previous playback (if any) and immediately display the new ad... i guess..
[00:30] <Eduard_Munteanu> burek: yes. Each is usually 30s long.
[00:30] <burek> but if they go out of sync even for the duration of a single ad.. well...
[00:31] <Eduard_Munteanu> burek: I do what mplayer's syncing algo does: seek above a threshold, vary the playback speed under it.
[00:31] <Eduard_Munteanu> burek: mind that some of those devices might not have hw acceleration, or might not be optimally configured.
[00:32] <Eduard_Munteanu> So if the video displays mostly correctly, even if the jitter is a tad high and it's lagging a bit, I still want to use that device.
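The correction policy Eduard_Munteanu is paraphrasing (hard seek when far off the shared clock, gentle speed change when close) can be sketched roughly like this; the threshold and the speed step below are invented for illustration, not mplayer's actual values:

    /* target_ms: where the shared clock says playback should be;
     * actual_ms: where playback actually is. */
    #define SEEK_THRESHOLD_MS 500.0     /* beyond this, a hard seek is cheaper  */
    #define SPEED_STEP        0.05      /* +/-5% speed change for small drift   */

    static void correct_drift(double target_ms, double actual_ms,
                              void (*seek_to)(double ms),
                              void (*set_speed)(double factor))
    {
        double err = actual_ms - target_ms;      /* > 0 means we are ahead */

        if (err > SEEK_THRESHOLD_MS || err < -SEEK_THRESHOLD_MS) {
            seek_to(target_ms);                  /* jump straight to target */
            set_speed(1.0);
        } else if (err > 0) {
            set_speed(1.0 - SPEED_STEP);         /* ahead: slow down a bit  */
        } else if (err < 0) {
            set_speed(1.0 + SPEED_STEP);         /* behind: speed up a bit  */
        } else {
            set_speed(1.0);
        }
    }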
[00:33] <burek> that's just a weak or wrongly configured device... it might be better to resolve the cause of the problem rather than the consequence..
[00:33] <burek> but i get the idea
[00:34] <burek> it's quite an interesting scenario :)
[00:35] <Eduard_Munteanu> burek: there's visible lag even simply because TVs have different processing lag
[00:36] <Eduard_Munteanu> burek: HDMI splitters add some too. So in any case, I want to add as little as possible.
[00:37] <burek> so, all you need is to write your own player that gets synced frequently and seeks/skips frames accordingly :)
[00:37] <Eduard_Munteanu> Yep. I already have one working, but I don't really want to depend on mplayer's socket.
[00:38] <Eduard_Munteanu> Also because I want more logic in playlist management.
[00:40] <burek> well, since you can't get a lot of help from NTP
[00:40] <burek> (because it also takes time to update/sync the clock)
[00:40] <Eduard_Munteanu> I should also say NTP isn't great either, so that might add a delta of a few tens of ms theoretically.
[00:41] <burek> maybe you could create a remote beacon source, which would send timestamps over the udp broadcast and sync the players
[00:41] <Eduard_Munteanu> burek: not a lot, with the right settings it gets into a usable margin rather quickly
[00:42] <Eduard_Munteanu> I don't expect an open loop system like that to work right.
[00:42] <burek> why not
[00:42] <Eduard_Munteanu> burek: wireless alone adds rather large and not very deterministic lag
[00:43] <burek> yes, but when you send the broadcast packet, it goes to 255.255.255.255
[00:43] <burek> so it's a single packet
[00:43] <burek> received at the same time by all devices
[00:43] <Eduard_Munteanu> Think about it, even NTP isn't really great under 10ms or so.
[00:44] <burek> and you don't need to send it often.. say, a 5 second interval would be enough
[00:44] <Eduard_Munteanu> Wireless hardware doesn't really work like that, it depends on a lot of settings.
[00:45] <Eduard_Munteanu> burek: and mind broadcast/multicast modes aren't well supported on many APs.
[00:45] <Eduard_Munteanu> Some freeze if you try to send a 1Mbit multicast stream.
[00:45] <burek> so, if i get this right, you can't properly sync devices over your wifi
[00:45] <burek> so, what's the use of it then
[00:46] <Eduard_Munteanu> Yeah, wifi was a big reason for this.
[00:46] <burek> wait, why would you have 1mbit broadcast/multicast?
[00:46] <bencoh> well, 1mbps isn't much
[00:46] <burek> i am talking about a single udp packet, containing the current timestamp
[00:46] <bencoh> so, why wouldnt you ?
[00:46] <Eduard_Munteanu> burek: because they suck at multicast :)
[00:46] <burek> it's less than 32 bytes
[00:47] <burek> it practically gets broadcast instantly, even on wifi
[00:47] <bencoh> oh, your control stuff, not the actual stream
[00:47] <Eduard_Munteanu> burek: I know, but that single packet depends on power management settings on the AP, sender and receivers, like the beacon interval, whether they go to sleep inbetween etc.
[00:47] <burek> all the clients have all the streams/files so no need for broadcasting the stream
[00:47] <burek> only syncing the playback time through timestamps broadcast
[00:48] <Eduard_Munteanu> burek: I agree, alone it doesn't seem to justify it
[00:48] <burek> Eduard_Munteanu yes, but the same packet will be broadcast at 1 point in time and all devices will receive it at that point in time
[00:48] <bencoh> Eduard_Munteanu: actually the fact that it is sent on beacon/DTIM isnt a big problem for sync
[00:49] <Eduard_Munteanu> But it gets rather nasty once all those nondeterministic lag sources get together.
[00:49] <bencoh> (the fact that most APs/clients suck is, though)
[00:49] <burek> my point is
[00:49] <burek> even if it gets some kind of a lag
[00:49] <burek> all the players will get that packet again at the same time
[00:49] <Eduard_Munteanu> bencoh: yeah, you're probably right. Maybe I've seen devices misbehaving.
[00:50] <burek> they will all lag
[00:50] <burek> thus, they will keep the sync
[00:50] <bencoh> Eduard_Munteanu: most of them are :)
[00:50] <bencoh> (misbehaving)
[00:50] <Eduard_Munteanu> burek: indeed, but it's not deterministic. There's a lot of different hardware.
[00:51] <burek> if a 32-byte UDP broadcast packet is not effective enough for the syncing, then you can safely forget about your wifi connectivity in general
[00:51] <Eduard_Munteanu> burek: besides, I don't feel like measuring the lag myself and putting in a magic number, in case the sender is also a player.
[00:51] <burek> Eduard_Munteanu, that's the point
[00:52] <burek> the sender is not playing it aloud
[00:52] <burek> it can be muted
[00:52] <burek> it is only used as a source for syncing
[00:52] <Eduard_Munteanu> Sure, no audio.
[00:52] <burek> all the other players will play in sync
[00:52] <burek> even if in lag
[00:53] <bencoh> burek: that's not true
[00:53] <bencoh> we're talking milliseconds here
[00:53] <burek> why isn't it
[00:53] <bencoh> you can have much more jitter than that on a wireless link
[00:53] <burek> so?
[00:53] <Eduard_Munteanu> If you try to ping over wireless you'll see rather non-deterministic delays.
[00:54] <burek> i agree
[00:54] <burek> why would it hurt players?
[00:54] <Eduard_Munteanu> And even if you're right, a closed-loop system is probably more robust across different hardware.
[00:54] <bencoh> so having a good 50ms of lag on a single 32-bit UDP packet would be quite common on "working" wireless connections
[00:54] <Eduard_Munteanu> burek: because not all players get the same roundtrip time / lag.
[00:55] <bencoh> oh btw, is this all about just syncing an internal clock, or triggering remote play/pause ?
[00:55] <Eduard_Munteanu> I also don't want to require all players to be the same "metric" away from the source. E.g. my source is playing videos too.
[00:56] <burek> this is my ping to an ap that is under a mile away from me http://pastebin.com/H3e9dX9p
[00:56] <Eduard_Munteanu> bencoh: syncing playback, not just on play/pause
[00:56] <burek> and you are telling me about 50ms lag on a LAN wifi?
[00:56] <bencoh> burek: quite common in a not-so-busy-but-suboptimal environment
[00:57] <bencoh> crowded urban area for instance
[00:57] <Eduard_Munteanu> burek: think about your local mall with lots of APs around and on the same channel etc.
[00:57] <burek> why would you put them on the same channel
[00:58] <bencoh> 7 packets transmitted, 7 packets received, 0.0% packet loss
[00:58] <bencoh> round-trip min/avg/max/stddev = 6.780/12.861/20.081/4.908 ms
[00:58] <Eduard_Munteanu> burek: I don't, but other channels aren't free either :D
[00:58] <Eduard_Munteanu> The allowed ones at least.
[00:58] <burek> ok, wait, we are losing the point here
[00:58] <bencoh> indeed :)
[00:58] <burek> even if you have at one moment a lag of 100ms
[00:58] <burek> it doesnt matter too much
[00:58] <Eduard_Munteanu> burek: well, I'm just saying it has to be robust :)
[00:59] <burek> since all the players will get that packet at the same time, since its a broadcast
[00:59] <Eduard_Munteanu> burek: I wouldn't assume that
[01:00] <Eduard_Munteanu> burek: think about it, NTP has trouble being very precise even on a wired LAN, and that's a closed-loop system.
[01:01] <Eduard_Munteanu> On a wired LAN you can get around 5-10ms deltas IIRC, but if you want better, you reach for PTP.
[01:02] <Eduard_Munteanu> Well, it might do better on a small LAN, but still.
[01:03] <burek> well there is even less point in using NTP if you think like that
[01:04] <burek> oh i got it
[01:04] <Eduard_Munteanu> burek: I do have a budget of 1-2 frames, so 40-80ms lag *overall*.
[01:04] <burek> you could send an ARP request
[01:04] <burek> that is an ethernet level packet
[01:04] <burek> and all devices should get it exactly as a single broadcast packet
[01:05] <Eduard_Munteanu> UDP itself doesn't add much lag. Most of the lag is physical stuff, I think.
[01:05] <Eduard_Munteanu> Like how APs and wireless cards react to incoming packets.
[01:06] <burek> i believe the best bet is to get a single beacon (packet) from your AP, received by all your players (assuming they are on the same AP), which would make them play in sync
[01:07] <burek> arp (or any other ethernet level) packet is one of such, because it generates just a single packet sent once (and received by all receivers)
[01:07] <Eduard_Munteanu> burek: I'm not saying it won't work, in fact it probably will in quite a few cases, but I don't want headaches. Besides, I'm reusing a rather popular protocol for time sync, so it's even easier than doing UDP stuff myself. :)
[01:08] <burek> btw, if you can't get such a precise beacon, there is no point in having wifi connectivity, right? :)
[01:08] <burek> oh boy :) just take a look on the internet for a simple udp send/receive example
[01:08] <burek> it's like.. 2 lines of code..
[01:09] <Eduard_Munteanu> NTP was designed to work over the Internet and give you a semi-reasonable synced clock.
[01:09] <Eduard_Munteanu> Even in the presence of large, nondeterministic lag on the order of hundreds of ms.
[01:11] <burek> ntp over wifi will work even worse..
[01:11] <burek> round-trip will be averaged
[01:12] <burek> which, with such variable ping times over wifi, will make it quite unreliable
[01:12] <Eduard_Munteanu> burek: um, no, that's the point, NTP accounts for round-trip as well. It's not an open-loop system.
[01:13] <burek> well yeah... but if it gets a different "ping reply time" each time, what's the point of the average round-trip time
[01:13] <Eduard_Munteanu> And you get much more precise timers ticking.
[01:13] <Eduard_Munteanu> Not just a beacon.
[01:13] <Eduard_Munteanu> burek: it just takes longer to converge, actually
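To make the round-trip point concrete: one NTP exchange yields four timestamps (client send t1, server receive t2, server send t3, client receive t4), and the textbook formulas recover both the path delay and the clock offset. A tiny C sketch of that arithmetic:

    /* Per-exchange NTP arithmetic; offset > 0 means the local clock is behind
     * the server. Times are in seconds as doubles, just for illustration. */
    static void ntp_sample(double t1, double t2, double t3, double t4,
                           double *offset, double *delay)
    {
        *delay  = (t4 - t1) - (t3 - t2);           /* network round-trip time     */
        *offset = ((t2 - t1) + (t3 - t4)) / 2.0;   /* local clock error vs server */
    }

The offset estimate assumes the outbound and return paths take equal time, which is exactly the assumption jittery wifi violates; the client therefore filters many samples, which is why Eduard_Munteanu says it just takes longer to converge.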
[01:14] <cbsrobot> Eduard_Munteanu: are you looking for a sync player or do you want to implement it?
[01:15] <Eduard_Munteanu> cbsrobot: I already have one, I was considering reimplementing it on top of a codec library instead of controlling mplayer, though.
[01:16] <Eduard_Munteanu> Assuming I get VDPAU H264 decoding if I feed ffmpeg a video.
[01:18] <Eduard_Munteanu> Hm, libx264 seems to be an encoder only. I wonder what's the actual codec that does hw-accelerated decoding.
[01:19] <burek> http://en.wikipedia.org/wiki/Reference_Broadcast_Time_Synchronization
[01:20] <burek> so it wasn't enough to go down to ethernet (layer 2) :) it needs to be layer 1
[01:21] <Eduard_Munteanu> burek: yeah, also it's more complex since they do take feedback from roundtrip time.
[01:22] <Eduard_Munteanu> burek: btw, if I wasn't clear, I'm just using a local NTP server, I don't care much about the global clock.
[01:22] <burek> it figures
[01:24] <Eduard_Munteanu> There are a couple of options you can pass the client to make synchronization happen very quickly, in a matter of minutes.
[01:24] <burek> lets get something straight
[01:24] <burek> once you sync your system clocks
[01:24] <burek> there is no need to sync them in the next hour or so (at least)
[01:24] <burek> right?
[01:25] <burek> otherwise something is very wrong with the hardware
[01:25] <burek> or lets make it a minute
[01:25] <Eduard_Munteanu> burek: yes, that's rather accurate
[01:25] <Eduard_Munteanu> Assuming you're using a modern timer in the kernel.
[01:25] <burek> so, once you sync your system clocks on the players, there is no need for syncing anymore, at least for the next minute
[01:25] <Eduard_Munteanu> Yep.
[01:26] <Eduard_Munteanu> burek: however you need to sync playback to the system clock too
[01:26] <burek> yes
[01:26] <burek> you can do that with ffmpeg
[01:26] <Eduard_Munteanu> Oh? How do you mean?
[01:26] <burek> just not with the executable, but if you are using ffmpeg libs
[01:27] <burek> just feed the decoder based on the system clock, instead of an internal one
[01:27] <Eduard_Munteanu> I think some formats have a notion of accurate timing themselves, e.g. MPEG. But most players don't care about timing much I think.
[01:27] <Eduard_Munteanu> burek: oh, I see, cool.
[01:27] <burek> well with ffmpeg you control the decoding process, so you can decode a frame whenever you like
[01:27] <burek> with ffmpeg libraries*
[01:28] <Eduard_Munteanu> burek: should I do the decoding synced, or just displaying? I'm worried decoding speed may vary across machines.
[01:28] <Eduard_Munteanu> Decoding lag, even.
[01:28] <burek> DTS is what you care about i guess
[01:29] <burek> no wait
[01:29] <burek> PTS is for display time, right?
[01:29] <Eduard_Munteanu> Those terms seem a bit familiar, but I don't know much about that.
[01:29] <burek> just make sure the frame is displayed at the desired point in time (i guess it's the PTS - presentation timestamp - or so)
[01:30] <Eduard_Munteanu> Ah.
[01:30] <burek> and stall it (or drop it) if it drifts
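A minimal sketch of the "display based on the system clock" idea just discussed. Here epoch_ms is assumed to be the agreed wall-clock time of stream position zero, and frame_pts_ms is the frame's PTS already converted to milliseconds via its stream time_base (both names are invented for this example):

    #include <time.h>
    #include <unistd.h>

    static long long wall_ms(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_REALTIME, &ts);           /* shared, NTP-synced clock */
        return (long long)ts.tv_sec * 1000 + ts.tv_nsec / 1000000;
    }

    /* Returns 1 if the frame should be shown, 0 if it should be dropped. */
    static int pace_frame(long long epoch_ms, long long frame_pts_ms)
    {
        long long due  = epoch_ms + frame_pts_ms;     /* when this frame is due */
        long long late = wall_ms() - due;

        if (late < 0) {                               /* early: stall until due */
            usleep((useconds_t)(-late) * 1000);
            return 1;
        }
        if (late > 40)                                /* over ~1 frame late at 25fps */
            return 0;                                 /* drop it */
        return 1;                                     /* slightly late: show it */
    }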
[01:30] <Dave92F1> Hi all! Been struggling with this for a while - how can I get ffmpeg to pipe its output to mplayer for display?  I need it to read from /dev/video1 (input 1, norm NTSC).
[01:30] <Eduard_Munteanu> Thanks, that seems like a viable approach.
[01:30] <Eduard_Munteanu> Better than varying the playback speed externally, I mean.
[01:30] <burek> definitely :)
[01:31] <Eduard_Munteanu> One issue with my approach was I had to make many frames keyframes.
[01:31] <burek> i was trying to solve the problem with the ffmpeg executable, thus the udp syncing of the start of the playback of each ad
[01:31] <Eduard_Munteanu> This might alleviate that concern.
[01:31] <Eduard_Munteanu> Ah, I see.
[01:32] <burek> Dave92F1, just output it to stdout
[01:32] <burek> ffmpeg -i INPUT ... -f mpegts -
[01:32] <burek> "-" means stdout
[01:32] <burek> or any other format you need instead of mpegts
[01:33] <Eduard_Munteanu> I think I should do the syncing bits from C directly.
[01:33] <Dave92F1> burek: OK, how do I tell ffmpeg to read from input 1? (I think I can do '-standard NTSC' to get NTSC).
[01:33] <burek> i need to go to sleep :) anyway, good luck with that player Eduard_Munteanu :) i hope you'll showcase it somewhere :)
[01:34] <Eduard_Munteanu> burek: I'm considering publishing it at some point, yeah. Thanks :)
[01:34] <burek> Dave92F1: ffmpeg -f <FORMAT> -i - ...
[01:34] <burek> "-" means stdin in this case, because it is at the position for the input parameter
[01:34] <burek> Eduard_Munteanu +1 :)
[01:35] <burek> oops, Dave92F1, use -f v4l2 -i /dev/video1
[01:35] <burek> http://trac.ffmpeg.org/wiki/How%20to%20capture%20a%20webcam%20input
[01:36] <Dave92F1_> burek: "ffmpeg -f v4l2 -standard NTSC -i /dev/video1 -f mpegts - | mplayer -" gives me "Cannot seek backward in linear streams!"
[01:36] <burek> try for the output: -f rawvideo - | ...
[01:36] <Dave92F1_> burek: Also, ffmpeg seems to read from input 0 by default (input 0 is the S-video input on my capture device; I need to read from input 1 which is the composite video)
[01:38] <Dave92F1_> burek: "ffmpeg -f v4l2 -standard NTSC -i /dev/video1 -f rawvideo - | mplayer -" gives me the same seek error
[01:38] <burek> http://www.linuxtv.org/wiki/index.php/V4L_capturing#Capture_from_composite_or_S-video
[01:39] <Eduard_Munteanu> Is there an API for decoding video generically, instead of calling the demuxers, decoders and display stuff myself?
[01:39] <burek> try: -f mpeg -
[01:39] <burek> try: -f mpeg - | mplayer ..
[01:40] <burek> Eduard_Munteanu, take a look at source (directory src/examples i think)
[01:40] <burek> or docs/examples
[01:40] <Eduard_Munteanu> Hm, looking at http://www.ffmpeg.org/doxygen/trunk/examples.html right now
[01:40] <Dave92F1_> burek: -f mpeg - | mplayer - also gives the seek error.  I looked at that link, but don't see how to select input 1 vs. input 0.
[01:41] <bencoh> there is a way to play from stdin using mplayer, but not this exact one
[01:41] <burek> Eduard_Munteanu http://git.videolan.org/?p=ffmpeg.git;a=tree;f=doc/examples;h=57fd81bfb3f98412bcf188ddd5a7012c8338a196;hb=610a8b1537fe728f4f1e44a5276f225334653123
[01:41] <burek> yeah, thats the same
[01:41] <bencoh> hmm, nevermind
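For Eduard_Munteanu's question about a generic decoding API: the doc/examples directory linked above (demuxing_decoding.c in particular) is the canonical reference. As a rough sketch, the core demux-and-decode loop against the current libavformat/libavcodec API looks like the following (the 2013-era API used avcodec_decode_video2 instead of the send/receive calls; error handling is omitted for brevity):

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    int decode_file(const char *path)
    {
        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, path, NULL, NULL) < 0) return -1;
        avformat_find_stream_info(fmt, NULL);

        const AVCodec *dec = NULL;
        int vstream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);

        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        avcodec_parameters_to_context(ctx, fmt->streams[vstream]->codecpar);
        avcodec_open2(ctx, dec, NULL);

        AVPacket *pkt = av_packet_alloc();
        AVFrame *frame = av_frame_alloc();
        while (av_read_frame(fmt, pkt) >= 0) {
            if (pkt->stream_index == vstream) {
                avcodec_send_packet(ctx, pkt);
                while (avcodec_receive_frame(ctx, frame) == 0) {
                    /* frame->pts (in the stream's time_base) is what the
                     * wall-clock pacing sketch earlier would consume */
                }
            }
            av_packet_unref(pkt);
        }

        av_frame_free(&frame);
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
        avformat_close_input(&fmt);
        return 0;
    }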
[01:42] <burek> Dave92F1 http://mplayerhq.hu/pipermail/mplayer-users/2005-April/052844.html
[01:42] <Eduard_Munteanu> Would gstreamer be more straightforward? I get a feeling this is a rather lowlevel API.
[01:42] <burek> you can read /dev/video1 from mplayer
[01:42] <burek> directly
[01:43] <burek> Eduard_Munteanu, you can try... i dont know precisely, but i also got a feeling that some things are easier to do in gs... try and see
[01:44] <burek> anyway, off to bed :) gn o/
[01:44] <Eduard_Munteanu> burek: g'night
[01:44] <Dave92F1_> burek: Thanks; I know how to read it with mplayer (mplayer tv:// -tv device=/dev/video1:input=1:norm=NTSC) works fine. But I want to do it with ffmpeg so I can tee the output from ffmpeg to both mplayer (for display) and mencoder (for recording).
[01:45] <bencoh> -f rawvideo pipe:
[01:45] <bencoh> in case it's raw video
[01:46] <bencoh> but you'd have to specify the video size/format to mplayer/mencoder
[01:46] <bencoh> (and framerate)
[01:46] <Dave92F1_> Gotta go now...back later.
[02:56] <Ruler2112> Is there any way to override the ISFT metadata field?  Using -metadata ISFT="whatever" does not work; seems to be locked to Lavf####
[03:00] <drv> i think you want to set "encoder", which will get translated to ISFT
[03:02] <Ruler2112> I've tried that - output file still has it set to 'Lavf55.16.102'
[03:06] <Ruler2112> VirtualDub can set the field to what I want it to be, but I'd like to eliminate it as a dependency of my script if possible.
[03:14] <drv> looks like it always gets overwritten unless you specify bitexact
[03:14] <drv> so you can try -flags bitexact, seems to work here
[03:20] <Ruler2112> Dude, you're awesome!  Works like a charm
[03:20] <Ruler2112> :)
[03:21] <Ruler2112> I do not see bitexact in the help; are there any other effects?
[03:21] <drv> it's probably hidden under the help for 'flags', not sure what else it does
[03:21] <drv> grep source for BITEXACT ;)
[03:23] <Ruler2112> I found this online - EDVAS use only bitexact stuff (except (i)dct)    - It's like defining a word using part of the same word. ;)
[03:30] <Ruler2112> Thanks again for the help.
[03:35] <ez> hi guys, how are you ? Can somebody help me ?
[03:37] <iive> nobody can help you
[03:37] <iive> if you don't say what the problem is
[03:38] <iive> don't ask to ask, just ask.
[03:38] Action: iive out
[03:39] <ez> I am creating a bash script for a customer that converts multiple images into a video slideshow. I need the images to be rendered in a specific order. I can generate the video using the * pattern but i am struggling with individual files, like this syntax: http://pastebin.com/NF8U9bzB Another question is: is there a way to have different durations for each image, something like "1st image stays for 2 sec, 2nd for 5 sec, etc"?
[03:40] <eZ> sorry, I was busy writing my question lol english is not my native language :)
[03:43] <eZ> it seems to get only the first image.. do I need to copy all images to a temp directory and render using the * pattern or something like that? Can I control the order of the images and the individual time each will be shown in the video?
[03:46] <eZ> Could I pipe instead ?
[04:06] <eZ> somebody here ?
[05:09] <mark4o> eZ: ffmpeg -f image2 -pattern_type glob -i "*.png" out.mp4
[05:11] <mark4o> eZ: http://superuser.com/a/619843/2087
[07:03] <eZ> mark4o: I need to do it without the pattern ...
[07:04] <eZ> mark4o: I need to build a specific order, without recopying the files ... there are other files in the dir
[07:05] <mark4o> eZ: you can use the concat demuxer, see http://superuser.com/a/619843/2087
[07:06] <mark4o> or if you want to keep a separate -i option for each one then use the concat filter
[07:06] <mark4o> https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20%28join,%20merge%29%20media%20files
[07:07] <mark4o> https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20%28join,%20merge%29%20media%20files#filter
[07:09] <mark4o> or for that matter you can use -f image2 -i "{file1,file2,file3}.png"
[07:09] <mark4o> instead of *.png
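Regarding eZ's other question (a different duration per image): the concat demuxer mark4o points to reads a small list file with file/duration directives, so both the order and the per-image times live in that file, which is then fed to something like "ffmpeg -f concat -i slideshow.txt out.mp4". A hypothetical slideshow.txt (names and times are placeholders; the last image is commonly listed a second time so that its duration is honored):

    file 'img1.png'
    duration 2
    file 'img2.png'
    duration 5
    file 'img3.png'
    duration 3
    file 'img3.png'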
[07:57] <Ruler2112> Hello.  I'm having a problem with an MKV clip when transcoding to XVID.  Behavior is the same whether I use the built-in mpeg4 codec or libxvid.  The original MKV is of a 4:3 source, but has black bars on the sides, though the resolution is 976x720.  If I resize using -s 530:400 and -vf scale=530:400, the resulting video does not have the black bars.  If I do NOT resize the original video and simply convert it to XVID, the resulting video keeps the black bars.
[07:58] <Ruler2112> Any ideas of how I could refrain from resizing (with the resulting loss of quality & introduction of digital artifacts) and eliminate the side black bars?
[07:59] <Ruler2112> I've tried faking out ffmpeg by passing the original resolution to the resizing functions, setting both sar and dar using -vf, etc.  After about 40 tries, I decided to ask. :)
[08:11] <mark4o> Ruler2112: Did you try -vf crop=960:720 ? (or whatever the resolution is without the bars)
[08:13] <Ruler2112> I had not, but just did.  Had no effect. :(
[08:16] <mark4o> Ruler2112: Didn't it make the video 960x720 instead of 976x720?
[08:17] <mark4o> If you didn't notice then perhaps it needs more cropping
[08:17] <mark4o> The cropdetect filter should be able to tell you exactly how much
[08:20] <Ruler2112> Just checked the resolution of the output and it did cut it down; not noticeable when watching though.  I'll try chopping it down some more, but I find it curious that simply resizing it (using the same aspect ratio) eliminates them.  A real resize of the entire picture would preserve the bars, would it not?
[08:23] <mark4o> Some codecs only support certain sizes, or sizes that are a multiple of 16 (whole macroblocks).
[08:24] <Ruler2112> I just tried 800x720 and there's still no difference; bars exist and video *looks* to be the same exact size.
[08:24] <mark4o> But yes, if any size is supported, then the bars should not be affected other than shrinking
[08:25] <Ruler2112> I've got to get to bed.  You've given me some ideas to play with tomorrow after work - thank you Mark.
[08:26] <mark4o> ok np, may want to try a different player also
[08:27] <Ruler2112> That's a good idea... I've been using VLC, but have MPC loaded as well.  (Listening to a football podcast in it.)
[08:29] <Ruler2112> Just tried MPC & no luck. :(   Ah well - I'll get back at it tomorrow.  Hope you have a great evening and thanks again.
[09:03] <aquarat> does anyone know if there's any kind of raw-codec support in ffmpeg ? like DNG, REDCode, Cineform, etc.
[09:05] <ebalsley> no REDCode or Cineform support.  I've heard old R3D files are supported, from before RED started scrambling their files.  But that's mostly irrelevant because all modern R3D files are scrambled
[09:06] <ebalsley> you can vote up the issue https://trac.ffmpeg.org/ticket/2690
[09:06] <ebalsley> can someone help me understand the --shared and --static configure options and their pros and cons?
[09:09] <aquarat> thanks ebalsley
[10:41] <Apic> A wonderful fine Sweetmorn (UGT)!
[11:17] <xlinkz0> can i make the log put a timestamp before every line?
[11:40] <burek> xlinkz0, try this http://stackoverflow.com/questions/21564/is-there-a-unix-utility-to-prepend-timestamps-to-lines-of-text
[11:40] <burek> ive used ts but you can pick whichever one you like the most
[11:41] <xlinkz0> burek: i'm interested when every log message was recorded
[11:41] <burek> yeah i understand that
[11:41] <xlinkz0> how can a utility know when each line was recorded? :)
[11:44] <kenansulayman> hi!
[11:44] <kenansulayman> What's the fastest way to decode ATRAC files?
[11:45] <burek> xlinkz0, because ffmpeg's output is piped to it
[11:45] <burek> and it can detect whenever a new line of log is output
[11:51] <xlinkz0> burek: but how do i do that with the -report option?
[13:40] <maep> hi, is there a way to signal packet loss to a decoder?
[13:42] <Apic> Good Question!
[13:43] <maep> Is that a 'no'?
[13:43] <Apic> No, that's a "I do not know either, but would really like to hear the Answer from someone competent in here."
[13:44] <maep> ah :)
[13:49] <maep> maybe setting AVPacket.flags to AV_PKT_FLAG_CORRUPT?
[13:49] <maep> is that passed to the decoder?
[18:55] <brontosaurusrex> is there a way to "measure" the complexity of a clip (these will be short clips)?
[18:55] <durandal_1707> what kind of complexity?
[18:55] <brontosaurusrex> durandal_1707, visual complexity, say a visually rich shot or a shot with lots of motion
[18:56] <brontosaurusrex> the idea is: a. from a very long clip take N short random clips; b. measure "complexity"; c. keep the N/2 more "complex" shots
[18:57] <brontosaurusrex> or N/3, N/4... whatever works
[18:57] <durandal_1707> there is scene change detect filter...
[18:58] <brontosaurusrex> hmm, yes i have to scene-detect these shots as well
[18:58] <brontosaurusrex> basically it will probably be "if a scene-cut was detected then delete"
[18:59] <brontosaurusrex> but that will happen before "complexity" evaluation
[19:00] <durandal_1707> i do not think there is such a filter, one which outputs some number that maps to 'complexity'
[19:01] <brontosaurusrex> the idea is to just downsize a bit and then encode with x264 crf and then read the average bitrate
[19:01] <brontosaurusrex> but i doubt that will really work
[19:01] <brontosaurusrex> so, open to some better ideas
[19:07] <userper> Hi there
[19:07] <userper> I wonder if anyone can help... trying to get the libfdk_aac library to work following an Ubuntu compilation from source
[19:08] <brontosaurusrex> userper, and?
[19:08] <userper> Ran the ./configure command as listed on the compilation guide
[19:09] <userper> But I am receiving "Unknown encoder 'lbfdk_aac' when I try to make use of it
[19:10] <userper> libfdk_aac was listed as an enabled audio codec during the configure process. But 'ffmpeg -codecs' does not list it
[19:10] <brontosaurusrex> userper, --enable-libfdk-aac --enable-nonfree ?
[19:11] <userper> Yep, I used both of those @brontosaurusrex
[19:12] <userper> Copy/pasted the commands from the compilation guide. Are the instructions you've linked on Crunchbang a better bet?
[19:13] <brontosaurusrex> userper, no, they are pretty old now i suppose
[19:13] <brontosaurusrex> so, last time i tried, a command like "ffmpeg -i in -c:a libfdk_aac -flags +qscale -global_quality 2 -afterburner 1 -vn out.m4a" should work
[19:14] <brontosaurusrex> that should produce an LC AAC thingy
[19:16] <userper> The current git clone doesn't seem to support "-c:a" so I've switched it to use -acodec
[19:17] <brontosaurusrex> mkay, and ?
[19:17] <userper> But that still gives me the "Unknown encoder 'libfdk_aac'" error
[19:18] <userper> "ffmpeg -codecs | grep aac"  only lists aac, aac_latm and libvo_aacenc
[19:18] <brontosaurusrex> ffmpeg -codecs | grep fdk , gives me
[19:18] <brontosaurusrex>  DEA.L. aac                  AAC (Advanced Audio Coding) (encoders: aac libfdk_aac )
[19:19] <brontosaurusrex> so i guess you missed something during the compile time
[19:19] <userper> That command doesn't give me any results, just the ffmpeg header
[19:19] <brontosaurusrex> you missed something during the compile time
[19:21] <durandal_1707> userper: actually to get fdk you need to recompile with fdk enabled flag
[19:22] <userper> durandal_1707 - I think I have, that was my reason for moving away from a static build. I included the "--enable-libfdk-aac" switch in the configure command.
[19:22] <adiulici> Hello. I have a problem with ffmpeg: it keeps getting closed and exits with return code "137". It doesn't always do this, but in about 66% of the cases. The command I'm using is: http://pastebin.com/dFQ884VB
[19:24] <durandal_1707> userper: where is it installed?
[19:26] <brontosaurusrex> userper, my latest debian static compile, switches http://paste.debian.net/plain/60915
[19:26] <brontosaurusrex> (note that i don't have a clue about compiling, so all i do is trial/error)
[19:27] <userper> I have noticed the version line I have suggests that I'm not using my compiled version, so going to check: ffmpeg version 0.8.6-4:0.8.6-0ubuntu0.12.04.1, Copyright (c) 2000-2013 the Libav developers
[19:28] <brontosaurusrex> good catch :)
[19:28] <durandal_1707> that is not FFmpeg
[19:29] <userper> Yeah I know, I've seen mentions of the Libav "ripoff"
[19:29] <userper> Can't believe I hadn't spotted it until now
[19:38] <adiulici> Q: I have a problem with ffmpeg: it keeps getting closed and exits with return code "137". It doesn't always do this, but in about 66% of the cases. The command I'm using is: http://pastebin.com/dFQ884VB
[19:40] <userper> Yep sussed it, I was using /usr/bin/ffmpeg (which I thought I'd removed) instead of the compiled version in ~/bin. Thanks, sorry for taking up your time.
[19:42] <Mista_D> Any way to see "drop_frame_flag" via FFprobe? Need to distinguish between drop-frame and non-drop-frame videos.
[19:57] <sacarasc> adiulici: Can you paste the full output somewhere, too.
[19:58] <adiulici> Yes, I'll paste it in pastebin. Just a sec
[20:05] <adiulici> sacarasc: this is the output I get from my cron job: http://pastebin.com/B82QqeGe
[20:06] <adiulici> sacarasc: and this is a normal output (when everything goes fine) : http://pastebin.com/z2QYeWt9
[20:25] <llogan> adiulici: does your host kill some processes that last longer than x minute(s)?
[20:27] <adiulici> llogan: I will ask my sys admin, but I'm not sure that's the problem because I measured how long the ffmpeg runs when it gets killed and very often when it gets killed it only runs 2 or 3 seconds. Sometimes it gets to run for 30 seconds also, but it also happened that it ran for 50-60 seconds and finished with no problem.
[20:28] <llogan> maybe, if they do have something implemented, it's triggered by CPU instead
[20:28] <llogan> have you tried on another machine?
[20:30] <llogan> i've never seen mv4 and aic used with libx264 before.
[20:32] <adiulici> I really am not an expert with ffmpeg :) But this command was what worked for me and did its job well. I wrote the sys admin a message now and asked him if there is a program that's triggered by CPU
[20:32] <adiulici> Thanks
[20:33] <llogan> those flags are ignored
[20:34] <adiulici> Ok, i didn't know. Thanks :)
[20:35] <llogan> https://trac.ffmpeg.org/wiki/x264EncodingGuide
[20:40] <v0lksman> llogan: I've been through that encoding guide and tried a million things with no luck.  Any docs or tips on encoding user-submitted video to mp4 that will work on an iOS device?
[20:42] <llogan> v0lksman: which device(s)? they can vary.
[20:44] <v0lksman> well I'm doing my testing from an iphone5...but I need the video to be versatile...say iphone4+
[20:45] <llogan> H.264 Main profile 3.1
[20:46] <llogan> according to: https://developer.apple.com/library/mac/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW8
[20:46] <llogan> (see Preparing Media...)
[20:46] <llogan> -profile:v main -level 3.1
[20:47] <v0lksman> will give that a shot...tried a bunch of profile and level (even ref) settings yesterday with no luck...I'll try to paste some of the commands and see if anyone can poke holes in them
[20:47] <v0lksman> thanks!
[20:48] <llogan> i'm not sure what the max frame size dimensions are though
[20:49] <v0lksman> I think that's listed on that apple page in the same section
[20:50] <llogan> you can use scale filter to automatically scale and keep aspect: -vf scale=1280:-1
[20:50] <llogan> also see -force_original_aspect_ratio in scale: http://ffmpeg.org/ffmpeg-filters.html#scale
[20:51] <llogan> or scale="1280:trunc(ow/a/2)*2" to (probably) avoid a "non-divisable by 2" value
[20:53] <llogan> ...assuming you're downscaling (no need to up it).
[21:01] <v0lksman> so trying to keep it super simple:  https://dpaste.de/Lw0Z
[21:01] <v0lksman> won't stream on iphone5 (IOS7)
[21:03] <v0lksman> basically I'm placing the output file on a webserver and hitting it directly in Safari on the phone
[21:03] <v0lksman> works fine for other videos that I didn't encode
[21:22] <v0lksman> llogan: sorry here's the full output http://pastie.org/8424900
[21:26] <llogan> v0lksman: does it work if you use -profile:v baseline -level 3.0?
[21:29] <v0lksman> http://pastie.org/8424912 - nadda...same behaviour
[21:34] <llogan> iDunno
[21:34] <llogan> out of ideas...
[21:34] <llogan> omit the audio stream with -an
[21:35] Action: llogan ignorantly shooting in the dark
[21:36] <v0lksman> will try that!  thanks for even the blanks...I've tried a TON of different settings..nothing seems to work.  My buddy has a video he encoded in some windows app that works fine. Even tried handbrake no luck
[21:37] <llogan> can you provide the link?
[21:43] <Xx_pk_xX> hello guys. I have an ffmpeg-related question. Is there a way to find an exact frame using ffmpeg? Or an exact sound. What I need to do is an automated system to find the exact time of commercials in h264 mp4 files recorded from TV. Is there a way to do something like that using ffmpeg? Let's say I have a frame of an ending commercial, or a sound file, is there a way to find it in a video?
[21:46] <llogan> Xx_pk_xX: an example finding the exact frame (only works if the frame and image are literally exactly the same) http://superuser.com/a/663947/110524
[21:46] <llogan> ...so that probably won't work for you now that i actually read the question
[21:46] <Xx_pk_xX> yes, I don't think that would work, because the frames might not be exactly the same
[22:02] <v0lksman> llogan: hey...in moving the file it works on a different host.  So it must be something with my dev server.  Thanks for the help though.  I should have tried that earlier
[22:04] <llogan> weird
[22:52] <pure> Hi, is there anything wrong with what I'm doing here? http://pastebin.com/anLpVVf3
[22:52] <pure> I'm getting a blank stream when I look on twitch.
[23:36] <brontosaurusrex> pure, any errors if you redirect this to a file?
[23:52] <llogan> pure: this script again... you probably need to add -pix_fmt yuv420p, but this is a guess since you did not provide the complete console output
[23:53] <llogan> which is a requirement when asking questions here
[00:00] --- Thu Oct 24 2013


More information about the Ffmpeg-devel-irc mailing list