[Ffmpeg-devel-irc] ffmpeg.log.20180418

burek burek021 at gmail.com
Thu Apr 19 03:05:01 EEST 2018


[00:49:37 CEST] <MASM> How can i show the data that is coming into ffmpeg? i have this command: " ffmpeg -listen 1 -i tcp:ip:port output.mp3
[00:53:10 CEST] <klaxa> what do you mean? like what clients connect?
[00:53:17 CEST] <klaxa> i don't think thats possible :S
[00:53:28 CEST] <klaxa> api doesn't expose that
[00:54:09 CEST] <klaxa> i've thought about that as well, might be useful to add
[00:54:33 CEST] <MASM> i want to decode g726, but with that command ffmpeg doesn't do anything. when i create a simple tcp server it shows raw data coming in, but ffmpeg doesn't show or do anything
[00:55:07 CEST] <klaxa> hmm... what does it say when you add -loglevel debug ?
[00:57:09 CEST] <MASM> https://pastebin.com/MmqWCuPD
[00:57:35 CEST] <MASM> it shows nothing
[01:00:00 CEST] <MASM> klaxa: and when i run a nodejs TCP Server for test, it receives data
[01:02:00 CEST] <klaxa> hmm i can't reproduce that
[01:02:31 CEST] <klaxa> i ran: ffmpeg -listen 1 -i tcp://192.168.1.5:13000 -c copy test.mkv in one terminal
[01:02:41 CEST] <klaxa> and cat somefile.mkv | nc 192.168.1.5 13000
[01:03:05 CEST] <klaxa> and it worked as expected, ffmpeg read somefile.mkv over the tcp socket and wrote test.mkv
[01:11:11 CEST] <klaxa> hmm
[01:11:17 CEST] <klaxa> but you can write files in general right?
[01:11:24 CEST] <klaxa> like touch asdf produces asdf ?
[01:12:05 CEST] <MASM> I tested the command " ffmpeg -loglevel debug -listen 1 -i tcp://192.168.0.11:13000 -c copy test.mkv "
[01:12:10 CEST] <MASM> but no file is created
[01:12:40 CEST] <MASM> [AVIOContext @ 0x9d1ffe0] Statistics: 379545 bytes read, 0 seeks
[01:12:54 CEST] <MASM> and there are packets or data coming in
[01:13:04 CEST] <MASM> and no file is created when i search for it
[01:13:17 CEST] <MASM> i have the permissions, I am root
[01:13:50 CEST] <MASM> it is a test server ubuntu 14.04, in virtual machine
[01:13:55 CEST] <MASM> vmware
[01:14:15 CEST] <klaxa> that's really weird
[01:15:12 CEST] <MASM> the source is a dvr that points to an ip and port, it sends the audio and video to separate ports
[01:15:20 CEST] <MASM> i found a solution to get the video
[01:15:23 CEST] <klaxa> like i said, it's working as expected on my end, i would be highly surprised if there was a bug in 3.3.3 that's not in 3.4.2
[01:15:36 CEST] <MASM> but when i try to get the sound i can't
[01:16:12 CEST] <klaxa> can you create some sort of sample?
[01:16:23 CEST] <MASM> ok
[01:16:29 CEST] <MASM> like what?
[01:16:31 CEST] <klaxa> one that someone else could "replay" with netcat
[01:16:46 CEST] <klaxa> you said a normal tcp listening socket receives data
[01:17:11 CEST] <MASM> i could point to your ip
[01:17:13 CEST] <MASM> and port
[01:18:37 CEST] <klaxa> ok i'll send you hostname and port in a query
[01:20:42 CEST] <klaxa> huh...
[01:20:53 CEST] <klaxa> well netstat says a connection is there at least
[01:21:03 CEST] <exastiken__> Are there any decoding options that use HEVC/x265?
[01:21:09 CEST] <MASM> i redirected it to you?
[01:21:15 CEST] <MASM> !*
[01:21:38 CEST] <klaxa> ok, does the stream stop at some point?
[01:22:21 CEST] <MASM> i need to send a command to it, to send audio to the ip and port
[01:23:55 CEST] <klaxa> well i got:
[01:24:00 CEST] <klaxa> >[AVIOContext @ 0x55b990c09600] Statistics: 1049597 bytes read, 0 seeks
[01:24:00 CEST] <klaxa> >tcp://0:12345: Invalid data found when processing input
[01:24:30 CEST] <MASM> it is g726
[01:24:42 CEST] <MASM> i read it in the manual
[01:25:26 CEST] <MASM> "The server needs to perform G726 decoding after receiving the data, otherwise can not play."
[01:26:08 CEST] <klaxa> hmm maybe you need to specify -c:a g726 before -i tcp://...
[01:26:22 CEST] <MASM> i tried it but it didn't work
[01:29:31 CEST] <klaxa> hmm i don't know what else to try then, maybe someone else does
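A command along the lines of what MASM was after, assuming the ffmpeg build includes the raw G.726 demuxers (-f g726 or -f g726le, depending on bit-packing order); the 4-bit code size and 8 kHz sample rate are guesses about the DVR stream, not something confirmed in the log:

    ffmpeg -f g726 -code_size 4 -ar 8000 -listen 1 -i tcp://192.168.0.11:13000 out.wav

If that produces garbage, -f g726le with the same options would be the other variant to try.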
[01:31:37 CEST] <MASM> klaxa: thanks for your help
[02:23:55 CEST] <Swervz> Hey does anyone know why when I use ffmpeg -i filename.mkv -vcodec copy -acodec copy 1.mp4 to remux a mkv into a mp4 it changes the framerate from constant to variable? I never specified to use a different framerate and want it to be constant like the source file
[02:30:32 CEST] <c_14> try -vsync passthrough and/or -copyts
[02:32:11 CEST] <Swervz> Ok I'll see if those work thank you
[02:39:02 CEST] <Swervz> Do I use those after the input file? I tried them on their own and then together but it still outputs a file with a variable framerate
[02:39:11 CEST] <c_14> eeeeeh, I think so
[02:40:25 CEST] <Swervz> the original file has a constant framerate of 23.976, the output has a framerate of 23.976 with a min of 23.256 and a max of 25.000
[02:40:37 CEST] <Swervz> I don't know why this has never happened before
[02:42:31 CEST] <c_14> maybe the input isn't actually cfr?
[02:42:38 CEST] <c_14> have you tried dumping and comparing the frame timestamps?
[02:44:10 CEST] <Swervz> I'm not sure how I do that
[02:49:37 CEST] <c_14> ffprobe -show_frames
[02:49:44 CEST] <c_14> you might be able to just diff that
[02:50:04 CEST] <c_14> you'll probably want -select_streams v though
[02:54:12 CEST] <Swervz> is this what I want to look at? pkt_duration_time=0.041000
[02:54:47 CEST] <c_14> you want the pkt_pts
[02:55:04 CEST] <c_14> and maybe pkt_dts and best_effort_timestamp and pkt_duration or so
[02:55:08 CEST] <c_14> those are the interesting ones
[02:55:51 CEST] <c_14> you can limit what's output with -show_entries frame=best_effort_timestamp,pkt_duration,pkt_dts,pkt_pts
[02:57:20 CEST] <Swervz> ah ok so how do I know its cfr from the output?
[02:58:20 CEST] <c_14> I was going to say just compare the two files and check if any of the timestamps differ
[02:58:42 CEST] <c_14> If they do, something's weird and I don't know. If they don't, something else is weird and your source file either isn't cfr or the dest is but isn't detected as such
[02:59:32 CEST] <Swervz> can I output this to a file to diffcheck on?
[03:00:14 CEST] <c_14> sure, just pipe stdout to the file
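Put together from c_14's pieces, assuming the file names from Swervz's original command (filename.mkv and 1.mp4):

    ffprobe -select_streams v -show_entries frame=best_effort_timestamp,pkt_duration,pkt_dts,pkt_pts -of csv filename.mkv > mkv_frames.txt
    ffprobe -select_streams v -show_entries frame=best_effort_timestamp,pkt_duration,pkt_dts,pkt_pts -of csv 1.mp4 > mp4_frames.txt
    diff mkv_frames.txt mp4_frames.txt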
[03:01:50 CEST] <Swervz> Running it now, I think it's gonna take a while
[03:01:56 CEST] <Swervz> Video is around an hour
[03:08:38 CEST] <kepstin> note that mkv rebases all the timestamps to a millisecond timebase, so it doesn't preserve them accurately anyways. But it should be far less variation than that.
[03:09:20 CEST] <Swervz> It just needs to be constant because adobe premiere has desync issues with vfr
[03:22:51 CEST] <Swervz> c_14, the two outputs seem completely different, original mkv on the left and output mp4 on the right https://i.imgur.com/PyPTPds.png
[06:48:16 CEST] <shtomik> Good morning, guys, maybe somebody knows, how to convert QImage RGB32 to AVFrame ?
[06:48:48 CEST] <shtomik> I have a filter for the pic, I only need an AVFrame with a pointer to the QImage data
[08:37:28 CEST] <acresearch> people i have a set of images (3840 x 2160) pixels, this should be 4K quality right?    anyway i have an ffmpeg command to combine them all into a video, but the resulting video is low quality, anyone can help me? this is the command: ffmpeg -f image2 -i video%4d.png -r 30 -vcodec libx264 -pix_fmt yuv420p -acodec libvo_aacenc -ab 128k -profile:v high -level 4.2 video.mp4
[08:38:39 CEST] <SimAV> acresearch, you probably want to set the video bitrate, too, not only the audio bitrate
[08:38:48 CEST] <acresearch> it is audioless
[08:39:03 CEST] <SimAV> " -acodec libvo_aacenc -ab 128k"
[08:39:35 CEST] <acresearch> SimAV: i have this string in my command
[08:39:49 CEST] <SimAV> acresearch, this part of your ffmpeg command reads: "use the audio codec provided by libvo_aacenc and encode audio at the bitrate of 128 kbit/s"
[08:40:07 CEST] <acresearch> oh, so i should remove it
[08:40:09 CEST] <acresearch> ok
[08:41:08 CEST] <SimAV> probably yes, if you don't have audio. And instead, you want to set the bitrate (or "data-rate") of your video
[08:41:30 CEST] <acresearch> isn't that -r 30?
[08:41:37 CEST] <SimAV> that is the framerate
[08:41:41 CEST] <acresearch> aha
[08:41:42 CEST] <SimAV> how many pictures per second
[08:41:47 CEST] <acresearch> i see
[08:42:17 CEST] <SimAV> what you want to set is how much storage space per second you are willing to sacrifice for your resulting video
[08:42:48 CEST] <acresearch> SimAV: hmmm
[08:43:05 CEST] <acresearch> what does the command string look like?   -c:v  ?
[08:43:13 CEST] <SimAV> if ffmpeg has to squeeze your 4K video in 5 bytes/second, you can't expect more than a uniformly colored image
[08:43:27 CEST] <acresearch> SimAV: i see
[08:43:33 CEST] <SimAV> I guess your complaints are about the video being "blocky" / unsharp / ...?
[08:43:38 CEST] <acresearch> oohhh ok i see what this means
[08:43:45 CEST] <acresearch> SimAV: true
[08:44:37 CEST] <acresearch> i should use -b:v   correct?
[08:44:45 CEST] <SimAV> yes
[08:45:02 CEST] <SimAV> how large are your input images?
[08:45:04 CEST] <acresearch> what would be a good value for 4k?
[08:45:17 CEST] <SimAV> acresearch, that totally depends on the data you want to compress
[08:45:35 CEST] <acresearch> (3840 x 2160) pixels
[08:45:42 CEST] <SimAV> no, in bytes.
[08:45:50 CEST] <acresearch> let me try lossless and see how large is the file, it is only 1 minute long
[08:45:57 CEST] <acresearch> ah
[08:46:03 CEST] <SimAV> what is average filesize of your images?
[08:46:12 CEST] <acresearch> 1,728,837 bytes
[08:46:51 CEST] <acresearch> -b:v 2M   ?
[08:47:10 CEST] <SimAV> ok, nearly 2 MB. You want 30 of them per second in your resulting video. And there are 8 megabits in a megabyte.
[08:47:35 CEST] <acresearch> wow this is complicated
[08:47:38 CEST] <acresearch> 1 second
[08:47:45 CEST] <SimAV> no, its just very basic math
[08:47:59 CEST] <acresearch> SimAV: no the concept of coming up with the commands
[08:48:17 CEST] <acresearch> -b:v 480 ?
[08:48:33 CEST] <SimAV> so you could try 2000k * 30 * 8, e.g. "-b:v 480000k"
[08:49:06 CEST] <acresearch> wait 480K not 480M?   i got 480,000,000
[08:49:22 CEST] <acresearch> 2MB is 2,000,000 bytes right?
[08:49:39 CEST] <SimAV> yes?
[08:50:01 CEST] <SimAV> times 30 (frames per second), times 8 (bits per byte)
[08:50:33 CEST] <acresearch> yes 480,000,000 not 480,000      or did i do the wrong math?
[08:50:49 CEST] <acresearch> i just want to make sure i learn correctly
[08:50:53 CEST] <SimAV> 480K = 480 000
[08:51:12 CEST] <SimAV> 480M = 480 000 000
[08:51:30 CEST] <SimAV> and we have 2M * 30 * 8...
[08:51:40 CEST] <acresearch> = 480M
[08:51:42 CEST] <SimAV> yes
[08:51:44 CEST] <acresearch> ok
[08:51:53 CEST] <acresearch> so -b:v 480M ?
[08:51:59 CEST] <SimAV> try that
[08:52:12 CEST] <acresearch> ok so this is my command: ffmpeg -f image2 -i Movie/video%4d.png -r 30 -vcodec libx264 -b:v 480M -pix_fmt yuv420p -profile:v high -level 4.2 video.mp4
[08:52:13 CEST] <SimAV> you probably can have goodlooking results with less, but not with orders of magnitude less.
[08:52:26 CEST] <acresearch> is there anything unnecessary i should remove (like the sound)?
[08:52:37 CEST] <acresearch> SimAV: ok
[08:59:20 CEST] <acresearch> SimAV: ok i just completed the video. it is 143MB but still pixelated
[08:59:54 CEST] <acresearch> SimAV: maybe there is a problem with this -pix_fmt yuv420p ?
[09:06:23 CEST] <SimAV> acresearch, you can try yuv444p and -preset:v slower
[09:07:15 CEST] <acresearch> SimAV: ok let me try
[09:11:54 CEST] <tringuyen> hello guys
[09:13:04 CEST] <tringuyen> uhm, should i ask question about ffmpeg in here?
[09:25:19 CEST] <SimAV> acresearch, did you get now a satisfactory result?
[09:25:31 CEST] <acresearch> SimAV: still computing
[09:25:57 CEST] <acresearch> SimAV: 00:30 from 01:20
[09:26:54 CEST] <SimAV> tringuyen, probably yes?
[09:41:09 CEST] <tringuyen> hi guys, I looking for the command line to generate Webvtt
[09:41:45 CEST] <tringuyen> i need to create preview-thumbnail like this
[09:42:20 CEST] <tringuyen> https://support.theoplayer.com/hc/en-us/articles/207460505-Preview-Thumbnails-in-1-X
[09:42:59 CEST] <tringuyen> currently, i can create thumbnails based on duration (i need to split the video into 100 pieces) with the command lines below:
[09:43:15 CEST] <tringuyen> I get the duration of the video (the requirement is 100 sections for each video), then create the thumbnails from it:
    # Get duration of mp4 file to generate thumbnails for
    duration="$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 $video)"
    # Create folder to store thumbnails if it does not exist
    mkdir -p $BASEDIR/$filename
    # Create thumbnail with size 200:100
    ffmpeg -i $video -filter:v scal
[09:43:54 CEST] <tringuyen> but now, i don't know how to extract frame timestamps to use with the vtt file
[09:44:47 CEST] <acresearch> SimAV: much better quality, but still not HD, can the command be optimised?
[09:44:50 CEST] <tringuyen> i tried with: ffmpeg -f lavfi -i "movie=test639.mp4[out0+subcc]" -map s -f segment -segment_time 10 -segment_format webvtt TestVtt-%05d.vtt
[09:45:11 CEST] <tringuyen> but it says: Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)
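For the thumbnail images themselves, a sketch along these lines pulls one scaled frame at a fixed interval (the 10-second interval and output names are assumptions; the 200:100 size is from tringuyen's own command). The WebVTT index that maps each time range to a thumbnail still has to be written by a separate script, since ffmpeg's webvtt muxer handles text subtitles rather than thumbnail tracks:

    ffmpeg -i test639.mp4 -vf fps=1/10,scale=200:100 thumb-%05d.jpg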
[09:46:56 CEST] <SimAV> acresearch, I'm really wondering what kind of images you are trying to put together...
[09:47:34 CEST] <acresearch> SimAV: they are images of a molecule in motion, generated in a program called PyMOL and each frame exported to an image
[09:48:46 CEST] <SimAV> acresearch, can you post a sample image somewhere?
[09:49:51 CEST] <acresearch> SimAV: https://pasteboard.co/Hh9xTtd.png
[09:49:53 CEST] <SimAV> I can't believe that x264 still delivers unsatisfactory results at nearly half a gigabit...
[09:50:10 CEST] <acresearch> SimAV: the movie size is 150MB
[09:50:19 CEST] <acresearch> 1:18 minutes
[09:50:38 CEST] <SimAV> acresearch, at 30 frames per second?
[09:50:47 CEST] <acresearch> yes
[09:50:55 CEST] <SimAV> most of the background is actually black...
[09:51:04 CEST] <acresearch> less pixelated than previously but still some pixelation
[09:51:08 CEST] <acresearch> SimAV: true
[09:51:14 CEST] <acresearch> for the entire movie
[09:51:26 CEST] <SimAV> acresearch, can you paste the full ffmpeg line you are using?
[09:51:48 CEST] <acresearch> SimAV: ffmpeg -f image2 -i Movie/video%4d.png -r 30 -vcodec libx264 -b:v 480M -pix_fmt yuv444p -preset:v slower -level 4.2 video.mp4
[09:52:56 CEST] <SimAV> acresearch, can you paste as well a screenshot of the resulting video?
[09:53:07 CEST] <acresearch> SimAV: ok 1 moment
[09:53:08 CEST] <tringuyen> guys, if you have time, please help me with this pastbin: https://pastebin.com/wCse1sX9
[09:54:30 CEST] <acresearch> SimAV: https://pasteboard.co/Hh9zLyq.png
[09:54:46 CEST] <acresearch> SimAV: it may not be clear that the pixels are there because the image is made smaller by the website
[09:54:58 CEST] <acresearch> SimAV: but you might see it in some of the thin lines
[09:55:29 CEST] <acresearch> SimAV: i will be presenting this video at a conference so i don't want it to look cheap
[09:56:57 CEST] <SimAV> acresearch, I think I don't see what you want to show...
[09:57:28 CEST] <SimAV> acresearch, for scientific conferences i strongly recommend to rescale the video down to something the beamer can handle natively...
[09:58:33 CEST] <SimAV> acresearch, might it be that your problem is actually aliasing? (as the player probably can't output the video in its original size)
[09:58:49 CEST] <acresearch> hmmm
[10:00:18 CEST] <acresearch> well the quality is ok for me at the moment, but if i can make it better it will help. you know most research funders are non-scientists and they will like to see good graphics
[10:00:39 CEST] <SimAV> but on 800x600 px beamers
[10:00:40 CEST] <acresearch> anyway i have to go to a meeting, i really appreciate your help :-)
[10:00:55 CEST] <acresearch> SimAV: i am not sure how the beamer will be in the conference
[10:01:07 CEST] <SimAV> at least i have only seen _very_ few fullHD beamers on conferences
[10:01:25 CEST] <acresearch> you could be right... anyway i think this might be ok,,, thank you very much :-)
[10:04:15 CEST] <tringuyen> SimAV, would you help me with https://pastebin.com/wCse1sX9, please
[10:17:03 CEST] <tringuyen> hi guys, can we create webvtt file by using ffmpeg tool?
[11:04:55 CEST] <utack> hope i can ask something semi-ot here: is there another site like http://screenshotcomparison.com/ to compare encode screenshots easily?
[11:12:54 CEST] <BtbN> utack, what does that even do? Just tell you if two pictures are identical, while allowing some fuzz?
[11:22:40 CEST] <utack> BtbN allows you to compare two screenshots on mouse over
[11:22:51 CEST] <utack> but it seems dead
[11:23:02 CEST] <BtbN> compare in what way?
[11:23:55 CEST] <utack> it is for humans, you can look at them in comparison. when you hover the mouse it changes between screenshots
[11:24:07 CEST] <utack> and you can see if you like one or the other better
[11:25:24 CEST] <utack> kind of like this site, but with a full switch on hover instead of the bar to slide https://people.xiph.org/~tdaede/av1stilldemo/
[11:40:24 CEST] <zerodefect> I have a side-car subtitle file (in both SRT and STL) that I would like to read. I'm guessing it's possible to use the C demuxer API to encapsulate it in AVPackets?
[11:58:54 CEST] <zerodefect> I feel like an idiot. I was writing a simple test app but had forgotten to call 'av_register_all()'. Ignore my previous request :)
[12:03:23 CEST] <JEEB> well in the latest FFmpeg that isn't needed :)
[12:07:01 CEST] <zerodefect> @JEEB Did not know that. Good to know. Thanks!
[12:08:08 CEST] <JEEB> basically because 99.9% of all people want to enable everything, and the whole external registration thing was impossible (it required internal symbols)
[12:08:29 CEST] <JEEB> so it was considered that if and when we want to support external things on runtime, then that will have to be revisited
[12:10:54 CEST] <zerodefect> Ah ok. Interesting. I plan to transition to v3.4.2 as that will be the default version in Ubuntu 18.04 (Bionic Beaver, released at the end of the month). Do you know off the top of your head if the call will be needed in that version? Just more curious than anything. Don't worry if you don't know
[12:12:31 CEST] <JEEB> it is needed
[12:12:41 CEST] <JEEB> also it will not break, just become a warning
[12:12:48 CEST] <JEEB> in the newer versions
[12:13:04 CEST] <JEEB> (we finally branched out 4.0 a day or two ago)
[12:14:01 CEST] <zerodefect> Thanks
[12:54:02 CEST] <Chloe> zerodefect: you will still need to call avdevice_register_all() in order to use devices however
[12:54:35 CEST] <Chloe> due to how libavdevice is an extension to libavformat (it needs some way to register its components within libavformat)
[12:54:57 CEST] <Chloe> but only devices and protocols still need manual initialisation now I think
[12:55:38 CEST] <zerodefect> Ok, thanks. At the moment using software mux/demux/enc/dec.
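A minimal sketch of the initialisation being discussed, assuming a C++ program linked against libavformat and libavdevice; the version check is one way to build against both the 3.x and 4.0 APIs:

    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavdevice/avdevice.h>
    }

    static void init_ffmpeg()
    {
    #if LIBAVFORMAT_VERSION_MAJOR < 58
        av_register_all();        // required up to FFmpeg 3.4.x; deprecated (warning only) from 4.0 on
    #endif
        avdevice_register_all();  // still needed when libavdevice inputs/outputs are used
    }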
[13:24:45 CEST] <cpjoe> I am currently evaluating FFMPEG as a video generation solution for the business I work for. I am getting slightly overwhelmed by commercial use of H.264 to say the least and would like some clarification on the commercial use of this. We plan to use FFMPEG to generate videos which would be viewable to our end users free of charge. We work on a subscription model however so in effect there would be someone paying for the end users to h
[13:25:22 CEST] <cpjoe> Does this mean that we need a license for H.264 or not as I am not really sure on this? Can anybody here offer any advice on this?
[13:26:29 CEST] <shtomik> Hi to all guys, maybe somebody knows how to convert QImage RGB32 to AVFrame(RGB32)? Without copying memory?
[13:26:35 CEST] <cpjoe> Is there an alternate to H.264 which would mean that I don't even need to worry about the use of the video in a commercial sense as this is becoming a bit of a headache from a legal perspective?
[13:28:42 CEST] <iive> cpjoe, ffmpeg simply doesn't bother with patent stuff. It's your problem if there are patent claims on some of the technologies, and how to pay for them aka mpeg-la .
[13:29:20 CEST] <shtomik> @cpjoe Hi, openh264 is free to use.
[13:30:16 CEST] <kepstin> it's also a pretty terrible encoder, and iirc still constrained baseline only, but it'll do for video conferencing
[13:30:17 CEST] <iive> ffmpeg is lgpl, or gpl if you use it with a GPL library like x264. If you use an existing ffmpeg build, you have no issues with licenses; maybe just provide a link to the website.
[13:30:55 CEST] <cpjoe> So x264 is the way to go then to avoid all these issues?
[13:31:05 CEST] <kepstin> cpjoe: but yeah, ffmpeg doesn't have anything to say about patent licenses at all, and doesn't include any patent license grants. Talk to a lawyer.
[13:32:18 CEST] <cpjoe> Thanks for the responses so far. Will be back in an hour so any other advice is much welcome!
[13:32:21 CEST] <zerodefect> Open Sales Solutions LLC deal with the commercial licensing/use of x264.
[13:32:38 CEST] <zerodefect> I suggest you get in contact with them and ask our questions
[13:32:43 CEST] <cpjoe> OK
[13:32:44 CEST] <zerodefect> *our=your
[13:32:51 CEST] <zerodefect> They are very helpful
[13:33:12 CEST] <shtomik> it's true.
[13:33:12 CEST] <kepstin> yeah, I don't know if they handle the patent licensing, or just the code/copyright licensing.
[13:33:34 CEST] <kepstin> (x264 is available commercially licensed for non-gpl use)
[13:33:42 CEST] <shtomik> Guys, what about my question, does somebody know?
[13:34:01 CEST] <shtomik> what am I doing wrong?
[13:35:23 CEST] <kepstin> shtomik: you could maybe do that by creating an AVBuffer/AVBufferRef around the QImage with a custom free function.
[13:37:24 CEST] <shtomik> @kepstin can an AVBuffer be converted to an AVFrame?
[13:37:44 CEST] <kepstin> shtomik: an AVFrame holds references to AVBuffer(s) which hold the actual data.
[13:39:00 CEST] <shtomik> @kepstin oh sorry, thanks so much! I googled some variants with avpicture_fill and av_image_fill_arrays, but it didn't work (
[14:28:02 CEST] <kepstin> shtomik: keep in mind that the AVFrame owns the memory - you have to make sure that there's no other code that might use or modify the memory of the QImage while it's attached to the AVFrame.
[14:28:10 CEST] <kepstin> otherwise bad things will happen :)
[14:28:27 CEST] <shtomik> @kepstin thanks ;)
[14:28:32 CEST] <kepstin> (if you can't guarantee that, it's probably better to copy the data)
[14:29:05 CEST] <shtomik> @kepstin I know, thanks, but I can't convert it and set the pointers to the data
[14:29:41 CEST] <kepstin> you can't just "convert" it, because ffmpeg doesn't know how to free memory allocated from the C++ code in Qt
[14:38:29 CEST] <kepstin> you'd want to use av_buffer_create(), where the "data" parameter gets the QImage->bits() value, "opaque" is set to a pointer to the QImage, and "free" is a function (that you write) which takes that QImage pointer and frees it (running destructors, etc).
[14:40:36 CEST] <kepstin> filling in the AVFrame is kinda tricky, but for a packed format like this it should be basically setting frame->buf[0] = bufferref; and frame->data[0] = bufferref->data; in addition to filling in all the other fields with format, frame size, etc.
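A minimal C++ sketch of what kepstin describes, assuming a heap-allocated QImage in Format_RGB32 whose ownership passes to the frame; the helper names and the AV_PIX_FMT_BGRA mapping (Format_RGB32 on a little-endian machine) are assumptions, not something from the log:

    extern "C" {
    #include <libavutil/buffer.h>
    #include <libavutil/frame.h>
    #include <libavutil/pixfmt.h>
    }
    #include <QImage>

    // Called by libavutil when the last reference to the buffer is dropped.
    static void free_qimage(void *opaque, uint8_t * /*data*/)
    {
        delete static_cast<QImage *>(opaque);   // runs the QImage destructor
    }

    static AVFrame *wrap_qimage(QImage *img)
    {
        AVFrame *frame = av_frame_alloc();
        if (!frame)
            return nullptr;

        // Wrap the pixel data without copying; the free callback now owns the image.
        AVBufferRef *buf = av_buffer_create(img->bits(), img->byteCount(),
                                            free_qimage, img, 0);
        if (!buf) {
            av_frame_free(&frame);
            return nullptr;
        }

        frame->buf[0]      = buf;
        frame->data[0]     = buf->data;
        frame->linesize[0] = img->bytesPerLine();
        frame->width       = img->width();
        frame->height      = img->height();
        frame->format      = AV_PIX_FMT_BGRA;  // Format_RGB32 is stored B,G,R,A on little-endian; verify for your platform
        return frame;
    }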
[14:45:24 CEST] <shtomik> @kepstin oh thanks, I'm going to implement this
[15:02:35 CEST] <shtomik> @kepstin all is good and I completed it, but the AVBuffer data doesn't get set ;(
[15:03:20 CEST] <shtomik> @kepstin I set (uint8_t*)QImage->bits() as the bufferref data
[15:09:30 CEST] <shtomik> @kepstin buffer data is empty, lol
[15:23:58 CEST] <zerodefect> @shtomik, are you doing something like this? https://pastebin.com/GzMiupj3
[15:24:25 CEST] <zerodefect> In my example, I'm supposing that QImage is dynamically allocated
[15:25:27 CEST] <zerodefect> You need to ensure that the lifetime of QImage is longer than lifetime of AVFrame
[15:26:30 CEST] <shtomik> @zerodefect Yes, but with some type cast and with no deprecated funcs
[15:27:47 CEST] <shtomik> @zerodefect I'm passing a QImage& to the func, setting up an AVFrame and checking it, thanks for the reply!
[15:29:45 CEST] <zerodefect> And you're also setting pAVFrame->data[0] to point to pIm->bits()?
[15:31:15 CEST] <shtomik> @zerodefect no, frame->data[0] = bufferref->data;
[15:32:55 CEST] <zerodefect> Ok. That's fine. So have you checked value of pBufferRef->data in debugger immediately after calling 'av_buffer_create()'?
[15:34:10 CEST] <zerodefect> I have a very simple sample app which extracts AVPackets out of an SRT subtitle file.  Do I need to 'decode' those AVPackets? My goal is to encode the data into DVB Bitmap/Subtitles.
[15:37:16 CEST] <shtomik> @zerodefect Yes, \0...
[15:37:20 CEST] <shtomik> @zerodefect ;(
[15:38:54 CEST] <zerodefect> @shtomik, I'll make a guess and suggest that pIm->bits() is returning nullptr?
[15:40:27 CEST] <shtomik> @zerodefect looks like ((( and I don't understand why
[15:41:04 CEST] <zerodefect> ?
[15:41:05 CEST] <shtomik> @zerodefect img.isNull() == false
[15:41:14 CEST] <shtomik> @zerodefect why bits() == nullptr?
[15:42:08 CEST] <zerodefect> You're on your own there, I'm not a Qt dev. Sorry.
[15:44:39 CEST] <shtomik> @zerodefect thanks so much! I think the trouble really is in bits()
[16:07:09 CEST] <acresearch>  /exit
[16:07:10 CEST] <acresearch> exit
[17:15:07 CEST] <mort> Currently, avcodec_find_encoder and avcodec_find_decoder will return the first AVCodec they find which can encode or decode the desired codec. What if there was an environment variable to choose what codec to use instead?
[18:28:21 CEST] <stephen> When running ffprobe, what does (und) labelled on a stream mean?
[18:31:20 CEST] <ChocolateArmpits> undefined ?
[18:52:23 CEST] <stephen> I guess maybe I start from the beginning
[18:52:35 CEST] <stephen> I have a set of 8 video files
[18:52:44 CEST] <stephen> these are all streams from youtube of the same video
[18:52:52 CEST] <stephen> Just their different encodings
[18:53:02 CEST] <stephen> I'm trying to do some forensics on them
[18:53:32 CEST] <stephen> One of the first things I wanted was to know which is the original (or closest version to that)
[18:53:52 CEST] <stephen> So I went with the earliest original creation time
[18:59:00 CEST] <stephen> Here's ffprobe output: https://pastebin.com/hbkw8Hre
[19:00:04 CEST] <stephen> Hmm... well it has und, but there's no way that's the original
[19:00:29 CEST] <stephen> The original must not be here, because that's the earliest, but it doesn't have the sound stream
[19:03:51 CEST] <CoreX> i don't think youtube keeps the original video from the user, as they re-encode it, but i might be wrong; JEEB could tell you if he sees this
[19:04:36 CEST] <furq> they used to
[19:04:38 CEST] <furq> i don't know if they still do
[19:05:09 CEST] <furq> you definitely can't get the original stream if you're not the uploader though
[19:05:15 CEST] <furq> you could briefly but they nixed that years ago
[19:06:24 CEST] <stephen> Here's ffprobes on all the streams:
[19:06:26 CEST] <stephen> https://pastebin.com/wR2Urhdd
[19:06:40 CEST] <stephen> It'd be nice to have the original, but likely not necessary
[19:07:20 CEST] <stephen> So, for context, I'm fairly certain this user is hiding binary content of some sort in this and other videos
[19:07:50 CEST] <stephen> How to extract that has eluded a large group for quite some time.
[19:08:43 CEST] <stephen> I've got a ton of development experience in video, and quite a bit in codec research, but it was windows tools back then
[19:09:10 CEST] <stephen> The idea that I haven't pinged this channel yet to understand ffmpeg more frankly baffles me
[19:10:51 CEST] <Bombo> hi
[19:13:48 CEST] <furq> the obvious ways i can think of to hide binary data aren't things that would survive transcoding
[19:14:00 CEST] <furq> but i assume you think he's distributing this over youtube, in which case it'd have to be
[19:14:13 CEST] <furq> without account sharing, anyway
[19:14:22 CEST] <stephen> If I reference other videos or something, or you want to know more about wtf I am investigating, you can read the wiki @ https://www.unfavorablesemicircle.com/wiki/UnfavorableSemicircle_Wiki or https://www.reddit.com/r/UnfavorableSemicircle/
[19:14:31 CEST] <alexpigment> seems like if they wanted to share the original, they'd upload to Vimeo
[19:14:49 CEST] <Bombo> i'm encoding images i took with a webcam every 60s for 24h (*png) to a video with this: "ffmpeg -pattern_type glob -i 'Webcam-*.png' [...] -r 25" which does work, but it's pretty fast, can i slow that down? i guess i need to tell ffmpeg to use every frame/png like twice or 5x
[19:15:01 CEST] <stephen> Well, understand, whomever did this is an expert beyond experts in video encoding
[19:15:14 CEST] <furq> is this one of those test pattern accounts or something
[19:15:21 CEST] <stephen> No it is not
[19:15:28 CEST] <stephen> Already been researched
[19:16:24 CEST] <stephen> Over the years we've done a LOT of research on origins, and the author most definitely wants us to figure it out. We're not trying to hack something that wants to be kept private
[19:16:57 CEST] <alexpigment> Bombo: i think you can do -r before the -i, and you can specify fractional frame rates too
[19:17:01 CEST] <alexpigment> like -r 1/5
[19:17:08 CEST] <stephen> Warning: This project is the kind of thing that starts occupying a lot of your time and mental resources for very little payoff. If you are troubled by that sort of thing, tune out
[19:17:23 CEST] <alexpigment> i haven't tested in a while; you may need to use -framerate instead. i remember there was some change with that syntax
[19:17:37 CEST] <furq> -r is normally mapped to -framerate on inputs
[19:17:46 CEST] <furq> but yeah it's better to use the actual option instead of an alias
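For example, something along these lines would show each capture for half a second (the 2 fps input rate, encoder and output name are just illustrations, not from the log):

    ffmpeg -framerate 2 -pattern_type glob -i 'Webcam-*.png' -r 25 -c:v libx264 -pix_fmt yuv420p timelapse.mp4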
[19:18:47 CEST] <stephen> So the creator has modified at one point or another just about every thing you can think of on these videos.
[19:20:33 CEST] <stephen> They will often do things like misreport their length, not display frames (freeze) and resume, have really strange dimension settings (video is 64x64, height and width defaults are set to 50x50, but the stream is set to 0x0 (actually just 0))
[19:20:58 CEST] <stephen> Currently I'm researching how to see the frame data that's not being shown on screen
[19:22:20 CEST] <Bombo> alexpigment: ok i'm trying that
[19:23:14 CEST] <stephen> I ran into something called mf_mt_minimum_display_aperture
[19:23:41 CEST] <stephen> Basically anything that performs a sort of crop on a video but doesn't discard the actual frame data
[19:24:11 CEST] <stephen> And how I might reverse that crop, or at least read the frame data out
[19:24:41 CEST] <ChocolateArmpits> stephen, 'und' is a language field in this case
[19:25:25 CEST] <ChocolateArmpits> not sure if ffprobe adds it itself, or the stream is actually marked with that
[19:26:21 CEST] <ChocolateArmpits> und is part of 639-2/3
[19:26:38 CEST] <ChocolateArmpits> >
[19:26:38 CEST] <ChocolateArmpits> und (for undetermined) is used in situations in which a language or languages must be indicated but the language cannot be identified.
[19:27:05 CEST] <stephen> 639-2/3, is this a spec you are quoting I might read?
[19:28:40 CEST] <stephen> Ah, ISO spec
[19:28:44 CEST] <stephen> language
[19:29:58 CEST] <stephen> I'll assume any property attached to a video discussed here whose value is defined outside the scope of mpeg or video will be based on an ISO spec?
[19:31:28 CEST] <ChocolateArmpits> yeah iso 639 and it has parts 1,2,3 etc
[19:33:12 CEST] <Bombo> alexpigment: i think it skips images
[19:33:48 CEST] <Bombo> can ffmpeg render something into images? like filename? or filedate would make more sense i think ;)
[23:18:41 CEST] <SpeakerToMeat> Stupid stupid question, if I input something in yuv420 and output in dpx with rgb48le pix fmt, color space is converted from yuv to rgb, right?
[23:20:12 CEST] <sfan5> of course it is
[23:20:19 CEST] <sfan5> otherwise the results would be totally wrong
[23:20:28 CEST] <SpeakerToMeat> Yeah
[23:20:45 CEST] <SpeakerToMeat> And from what I see in the pix formats, is there (no) way to output dpx in sRGB space?
[23:21:08 CEST] <SpeakerToMeat> Or is the space being used sRGB?
[23:24:27 CEST] <SpeakerToMeat> Or, use a 709 color space?
[23:44:02 CEST] <ChocolateArmpits> SpeakerToMeat, not sure if doable with color-model conversion if you need accurate results
[23:44:21 CEST] <ChocolateArmpits> there's colorspace filter, but it only outputs yuv
[23:44:51 CEST] <ChocolateArmpits> there's zscale filter based on zimg, but zscale doesn't seem to offer color model conversion in comparison to zimg
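A sketch of the kind of command being discussed, assuming a BT.709 source (input name assumed); the scale filter's in_color_matrix option only tells swscale which YUV matrix to assume for the YUV-to-RGB step, it is not a full colour-managed conversion:

    ffmpeg -i input.mov -vf scale=in_color_matrix=bt709 -pix_fmt rgb48le output_%06d.dpx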
[23:49:29 CEST] <SpeakerToMeat> nod
[23:49:37 CEST] <SpeakerToMeat> Thank you CA
[00:00:00 CEST] --- Thu Apr 19 2018


More information about the Ffmpeg-devel-irc mailing list