[Ffmpeg-devel-irc] ffmpeg.log.20150908
burek
burek021 at gmail.com
Wed Sep 9 02:05:01 CEST 2015
[02:42:37 CEST] <PovAddict> I have an .m4a of unknown source
[02:42:50 CEST] <PovAddict> vlc can't play it, I think it might have DRM, how do I tell?
[02:44:09 CEST] <PovAddict> https://paste.kde.org/pkbcs5nse is that FairPlay?
[07:25:58 CEST] <portal> I'm having a slight issue with my mp4 encodes - I can't figure out if it's on the encode end or the webserver end. Whatever I do, you cannot seek the MP4 past 2h52m
[07:26:45 CEST] <portal> I thought it might be the mod_h264 pseudo-streaming plugin, but now I'm leaning towards an encode issue with my ffmpeg cmd
[07:29:23 CEST] <relaxed> portal: what was your command?
[07:30:00 CEST] <portal> ffmpeg -y -i file.m3u8 -c copy -bsf:a aac_adtstoasc -movflags faststart file.mp4
[07:32:56 CEST] <satiender> Hi , Can we save video data before muxing , is it possible with ffmpeg
[07:35:39 CEST] <relaxed> portal: can you seek with other players?
[07:35:43 CEST] <portal> nope
[07:35:48 CEST] <portal> VLC stops and gives me a green screen
[07:35:55 CEST] <portal> once it reaches that point
[07:36:05 CEST] <portal> I'm wondering if it's the M3U8 causing issues
[07:36:24 CEST] <portal> thing is, if i use ?start=3600 on the mp4 to start it an hour in, i can complete the file just fine
[07:39:06 CEST] <relaxed> Yeah, that's odd. Maybe look at "ffmpeg -h muxer=mp4"
[07:39:29 CEST] <relaxed> which version are you using?
[07:39:39 CEST] <portal> ffmpeg version 2.7.git
[07:41:14 CEST] <portal> I'm already setting the moov atom at the beginning of the file
[07:41:17 CEST] <portal> i wonder if that could be causing it
[07:41:38 CEST] <relaxed> No, that aids in seeking
[07:41:47 CEST] <portal> yeah that's what I figured..
[07:42:10 CEST] <portal> any suggestions what I should be looking for there? I can't possibly be the only one having this issue :P
[07:43:43 CEST] <portal> http://pastebin.com/ciGMLbX3
[07:43:52 CEST] <portal> thats the file metadata
[07:43:56 CEST] <portal> maybe you can see something there that i cant
[07:44:17 CEST] <portal> http://pastebin.com/Qt7mWF72
[07:44:22 CEST] <portal> ^ complete paste
[07:44:49 CEST] <relaxed> try remuxing with mp4box and see if that makes a difference
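A remux along the lines relaxed suggests could look like this; MP4Box is part of GPAC, and the filenames here are placeholders:

    MP4Box -add file.mp4 -new remuxed.mp4

MP4Box rewrites the container (including the seek tables) without touching the encoded streams, which is why it can isolate a muxing bug.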
[07:46:41 CEST] <relaxed> or try with the latest ffmpeg, http://johnvansickle.com/ffmpeg/
[07:50:34 CEST] <portal> kernel too old for that version
[07:50:35 CEST] <portal> hm
[08:01:44 CEST] <portal> trying to remux with mp4box to see if it helps
[08:01:52 CEST] <portal> if it does, what would the solution be without remuxing with mp4box?
[08:09:52 CEST] <relaxed> It might be a bug
[08:12:50 CEST] <portal> in the version of ffmpeg?
[08:15:26 CEST] <relaxed> could be
[08:16:58 CEST] <portal> weird
[08:17:05 CEST] <portal> it's a pita to compile it on CentOS as it is :P
[08:26:05 CEST] <portal> lol
[08:26:07 CEST] <portal> it was mod_h264
[08:26:07 CEST] <portal> pos
[08:50:23 CEST] <satiender> Portal: Can we save video data before muxing , is it possible with ffmpeg ??
[08:51:00 CEST] Last message repeated 1 time(s).
[08:52:07 CEST] <satiender> portal: Can we save video data before muxing , is it possible with ffmpeg
[08:52:32 CEST] <Max-P> yes, you can just use a raw format. But you can only output one stream doing that
[08:53:03 CEST] <satiender> Max-P : thanks for reply
[08:53:20 CEST] <satiender> my one confusion please solve
[08:53:59 CEST] <Max-P> What are you trying to do in summary?
[08:54:01 CEST] <satiender> Max-P : I want to save the video data after encoding, meaning before muxing
[08:54:24 CEST] <Max-P> So basically raw h264?
[08:54:39 CEST] <satiender> then what would the size of that file be compared to a ts file
[08:55:17 CEST] <satiender> I think that would be an extra large file compared to ts
[08:55:32 CEST] <Max-P> "ffmpeg -i yourvideo.ts -an -c:v copy -f raw yourfile.h264X
[08:55:42 CEST] <satiender> thanks
[08:56:00 CEST] <Max-P> yourfile.h264" **, the last X was meant to close the quotes
[08:56:11 CEST] <satiender> ok
[08:56:18 CEST] Action: Max-P throws phone at the wall
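For reference, a variant of that command that should behave as intended; note that ffmpeg's raw H.264 bitstream muxer is named h264, not raw, although with a .h264 output extension the -f flag can usually be omitted:

    ffmpeg -i yourvideo.ts -an -c:v copy -f h264 yourfile.h264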
[08:56:49 CEST] <satiender> Max-P : please tell me I am right or wrong
[08:57:20 CEST] <Max-P> I seriously doubt it would make a difference, the ts file should only be very slightly larger
[08:57:34 CEST] <satiender> Max-P: I am trying to record only the encoded data, for fast video processing
[08:57:48 CEST] <satiender> first I record video from camera
[08:57:54 CEST] <satiender> then save as raw data
[08:58:13 CEST] <satiender> then process the video again with ffmpeg overlay filters
[08:58:35 CEST] <satiender> then process it again with a blur filter
[08:58:50 CEST] <satiender> then I want save it in .mp4 format
[08:59:46 CEST] <satiender> I mean, if we want to process the video again and again, then we should save it as raw data
[08:59:58 CEST] <satiender> Is it right or wrong
[09:00:04 CEST] <satiender> please help
[09:00:09 CEST] <Max-P> That's doable in one command, but yes, it should work
[09:01:22 CEST] <Max-P> I mean, the intermediate files are going to be absolutely huge
[09:02:02 CEST] <satiender> Max-P : Then what I can do for fast processing and save memory
[09:02:09 CEST] <satiender> please help
[09:03:32 CEST] <Max-P> Okay, let's try to avoid the XY problem: forget everything and tell me more about what you have and what you want to produce as the final product
[09:03:49 CEST] <satiender> ok
[09:04:27 CEST] <satiender> Max-P : I have an android app, frankly.me, you can also download it from the google play store
[09:04:59 CEST] <satiender> Max-P : that is a video processing app
[09:05:34 CEST] <satiender> Max-P : user Record the video from their phone camera and then
[09:05:43 CEST] <satiender> goto preview
[09:06:22 CEST] <satiender> Max-P : in preview that recorded video is playing
[09:07:27 CEST] <satiender> Max-P : then if the user swipes the screen, a video filter is applied to that video
[09:08:00 CEST] <satiender> Max-P : the moment they swipe the screen, the video's color changes to grey
[09:08:33 CEST] <satiender> Max-P : then if they swipe the screen one more time, another filter is applied
[09:08:50 CEST] <satiender> Max-P : that process is very fast
[09:09:10 CEST] <satiender> Max-P : But the video is processed again and again
[09:09:28 CEST] <satiender> Max-P : that all process is done with Opengl
[09:09:42 CEST] <satiender> Max-P : But I want to do this with opengl
[09:09:49 CEST] <satiender> sorry FFmpeg
[09:10:39 CEST] <satiender> Max-P : So, my plan is to save that recorded video in raw format for fast video processing, and then process it with filters
[09:11:01 CEST] <Max-P> Okay. So, for the preview, you want to do that outside of ffmpeg, in the app. I guess you already do that. Then you want to save the transformed video to another mp4 file with ffmpeg
[09:11:49 CEST] <Max-P> Are you using libavcodec/format or the ffmpeg command-line binary?
[09:12:20 CEST] <satiender> I have .so files of ffmpeg for my android app
[09:12:28 CEST] <satiender> that is command line
[09:12:35 CEST] <satiender> with java
[09:13:11 CEST] <satiender> Max-P : Yes all is done with opengl but I want more processing with ffmpeg
[09:14:33 CEST] <satiender> My concluding question is: can we process video very fast, like OpenGL does, if we save the data in raw or ts format?
[09:14:42 CEST] <satiender> Max-P : Please help
[09:15:07 CEST] <Max-P> Ok. So forget the temporary files: do all the rendering in the app in realtime but don't save anything. When the user is done, take each frame of the input, apply the modification with opengl or whatever, then push them to h264 video stream in a mp4 container
[09:16:22 CEST] <satiender> Max-P : yes you are right
[09:16:41 CEST] <satiender> Max-P : Can we do the same process with ffmpeg
[09:17:00 CEST] <Max-P> Actually I would not use ffmpeg at all. Android already has support for encoding mp4 files
[09:17:43 CEST] <Max-P> Which is probably much, much better as it will use the hardware h264 chip
[09:18:10 CEST] <Max-P> Why do you want to use ffmpeg exactly?
[09:18:44 CEST] <satiender> Because ffmpeg has a large number of filters which can make crazier videos
[09:19:10 CEST] <satiender> like the hstack and vstack filters, these are awesome
[09:21:56 CEST] <Max-P> Humm, I think you can use libavfilter alone to process the image using Android native decoders
[09:23:21 CEST] <Max-P> I'm not sure I understand what you want, but the process is basically three steps: 1) decode the input video 2) apply filters 3) encode back to mp4
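A rough sketch of those three steps with the libav* C API of that era, showing only the per-packet loop; the format, decoder, encoder and filter-graph contexts (fmt_ctx, ofmt_ctx, dec_ctx, enc_ctx, buffersrc_ctx, buffersink_ctx) and video_stream_index are assumed to be opened and configured already, and error handling is elided:

    AVPacket pkt, out_pkt;
    AVFrame *frame = av_frame_alloc();
    AVFrame *filt  = av_frame_alloc();
    int got_frame, got_packet;

    while (av_read_frame(fmt_ctx, &pkt) >= 0) {
        if (pkt.stream_index == video_stream_index) {
            /* 1) decode the packet into a raw frame */
            avcodec_decode_video2(dec_ctx, frame, &got_frame, &pkt);
            if (got_frame) {
                /* 2) push the raw frame through the filter graph */
                av_buffersrc_add_frame_flags(buffersrc_ctx, frame, 0);
                while (av_buffersink_get_frame(buffersink_ctx, filt) >= 0) {
                    /* 3) encode the filtered frame and mux it into the mp4 */
                    av_init_packet(&out_pkt);
                    out_pkt.data = NULL;
                    out_pkt.size = 0;
                    avcodec_encode_video2(enc_ctx, &out_pkt, filt, &got_packet);
                    if (got_packet)
                        av_interleaved_write_frame(ofmt_ctx, &out_pkt);
                    av_frame_unref(filt);
                }
            }
        }
        av_free_packet(&pkt);
    }

For the preview path described below, step 3 is simply replaced by displaying the filtered frame instead of encoding it.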
[09:24:05 CEST] <satiender> Max-P : Yes ! Sir you pick perfectly
[09:24:29 CEST] <satiender> Max-P : but step two will be done by the user again and again
[09:24:52 CEST] <Max-P> The thing is you don't have to do it for the entire video
[09:25:05 CEST] <Max-P> You have access to each individual video frames
[09:26:09 CEST] <satiender> Max-P : By using ffmpeg, I want to skip the decoding step when processing the video again and again, for fast processing, by saving the raw data
[09:26:26 CEST] <satiender> of the video after recording
[09:26:43 CEST] <Max-P> For the preview you do steps 1 & 2 as is, but instead of encoding it back you display the frame
[09:27:05 CEST] <Max-P> You can change the filters as much as you want, the user sees the video playing in realtime
[09:27:26 CEST] <satiender> ok
[09:27:28 CEST] <satiender> Nice
[09:28:06 CEST] <satiender> Max-P : One other thing: suppose the video is in preview
[09:28:26 CEST] <satiender> Max-P : the user swipes the screen and a filter is applied
[09:29:36 CEST] <satiender> Max-P : the screen is swiped again by the user and another filter is applied, but I want that applied at the current position of the video, right at preview time
[09:29:53 CEST] <satiender> is it possible ??
[09:31:21 CEST] <Max-P> Yes. Just record the events to remember what the user did when, then when the user saves it you replay it all but encoding to mp4 as fast as you can (instead of displaying it)
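A minimal sketch, in C, of what "record the events" could mean; the names here are hypothetical, invented for illustration:

    typedef struct {
        int64_t pts;       /* stream timestamp at which the user swiped */
        int     filter_id; /* which filter became active at that point  */
    } FilterEvent;

During preview you append a FilterEvent on every swipe; during the final save you replay the frames and switch filters whenever a frame's pts passes the next recorded event.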
[09:33:29 CEST] <Max-P> Do you understand how the library works a little bit or how video processing work in general?
[09:37:33 CEST] <Max-P> Because I think you don't realize how much control you have :)
[09:38:55 CEST] <Max-P> The decoding process takes the mp4 file and gives you raw bitmaps for each frame of the video. Think of like a giant array with each frame of the video. You can almost access them however you wish.
[09:40:55 CEST] <Max-P> It's up to you what you want with those bitmaps. Right now you want to take that image, apply filters to it. You are also free to take that image, display it on the screen and wait 1/30 second and do that again with the next frame. You can also optionally take that image again and pass it to the encoder, which will put the images and encode them into h264/mp4 files
[09:42:11 CEST] <Max-P> You are applying the filters as it's playing, so if the user changes the filter midway through the video and you are both displaying and encoding, then the resulting file would be exactly what the user saw in the preview, as you essentially recorded the preview. Is that what you want?
[09:43:34 CEST] <satiender> Max-P : Thanks for very best info
[09:44:57 CEST] <satiender> Max-P : But if a frame is being previewed, is encoding that frame possible at the same time?
[09:46:29 CEST] <satiender> Max-P : I want to preview the filter on the current frame, but encode with the filter from the 1st frame
[09:47:03 CEST] <Max-P> Yes, as I said you access each individual image, so you can do whatever you want with it
[09:47:49 CEST] <Max-P> So in this case it's even simpler. When the user is done, you go back to frame 1, reapply the filter from it, and put it to an output encoder stream instead of displaying it
[09:50:50 CEST] <satiender> Max-P : Suppose the user is at the halfway position of the video in the preview, and I apply the filter from the 1st frame; in that situation I think there would be some delay before the applied filter shows at the user's current position
[09:51:38 CEST] <satiender> Max-P : what you think ??
[09:51:51 CEST] <Max-P> But you can do both ._.
[09:52:06 CEST] <satiender> yes
[09:52:43 CEST] <Max-P> When *previewing*, You do it live. You take a frame, apply the _current_ filter, and display it. Every 1/30th of a second
[09:53:01 CEST] <satiender> thanks
[09:53:26 CEST] <satiender> Max-P: Please can you give an example of that
[09:53:33 CEST] <satiender> please
[09:53:36 CEST] <Max-P> When the user has chosen his filter and clicks *save*, you go back to the beginning of the input, apply the selected filter to the frame, and push it to an output stream
[09:54:02 CEST] <Max-P> Thus in preview mode it's fast, nothing is saved. When done, you redo it in the background, once, and process the whole video.
[09:54:36 CEST] <satiender> Yes ! Correct
[09:54:55 CEST] <Max-P> I still don't understand what the problem is
[09:56:20 CEST] <satiender> The problem is: can we apply filters one after another, at the current position of the video, at preview time, with ffmpeg?
[09:58:47 CEST] <Max-P> Why not? You get the bitmap in *your* code. You don't tell ffmpeg to make a preview or anything. *You* connect the pieces together as you please
[10:00:17 CEST] <satiender> Max-P : Can you give me an ffmpeg example of that? That would be very helpful for me.
[10:00:40 CEST] <satiender> Max-P : And I am big thankful to you for that
[10:00:44 CEST] <satiender> please
[10:01:11 CEST] <Max-P> https://www.ffmpeg.org/doxygen/trunk/filtering_8c-source.html
[10:03:19 CEST] <satiender> But that is a C program for filtering
[10:06:19 CEST] <satiender> Max-P : Max please help
[10:06:25 CEST] <satiender> bro
[10:06:33 CEST] <satiender> :(
[10:08:38 CEST] <Max-P> satiender: You have to do it in C.
[10:08:49 CEST] <Max-P> ffmpeg is a C library
[10:09:21 CEST] <satiender> Max-P : Yes I am proficient in C
[10:09:49 CEST] <satiender> But actually I am doing that on Android which is in java
[10:10:11 CEST] <satiender> there is native support for that but it is very complex
[10:10:18 CEST] <Max-P> You have to use the NDK and use JNI on the Java side to talk to the C functions. You'll have to do that yourself using some data structure you'll define for it
[10:12:03 CEST] <Max-P> https://github.com/roman10/android-ffmpeg-tutorial This is the best I can find, I think example 2 is a video player. And it's all in the NDK.
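The Java-to-C bridge Max-P mentions boils down to a JNI entry point; a minimal sketch, with the class and function names invented for illustration:

    #include <jni.h>

    /* called from Java as: VideoProcessor.nativeApplyFilter(inPath, outPath, filterId) */
    JNIEXPORT jint JNICALL
    Java_com_example_VideoProcessor_nativeApplyFilter(JNIEnv *env, jclass clazz,
                                                      jstring in_path, jstring out_path,
                                                      jint filter_id)
    {
        const char *in  = (*env)->GetStringUTFChars(env, in_path, NULL);
        const char *out = (*env)->GetStringUTFChars(env, out_path, NULL);

        /* ... run the decode -> filter -> encode loop sketched earlier,
           reading from 'in' and writing the result to 'out' ... */
        int ret = 0;

        (*env)->ReleaseStringUTFChars(env, in_path, in);
        (*env)->ReleaseStringUTFChars(env, out_path, out);
        return ret;
    }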
[10:13:42 CEST] <Max-P> You should probably ask #android for that; that's more of an Android/NDK/Java problem than an ffmpeg one. We can help you with the libav* APIs but I have no idea how to actually do that on Android
[10:14:24 CEST] <satiender> Max-P : Yes Sir you give me very good and knowledgeable info
[10:15:42 CEST] <satiender> Max-P : Can I contact you on gmail or another communication site, for privacy?
[10:15:53 CEST] <Max-P> no
[10:17:46 CEST] <satiender> Max-P : ok
[10:17:55 CEST] <Max-P> I come and go in my free time when I feel like helping/talking to people about subjects such as this, but I don't have time to do personal support
[10:18:53 CEST] <satiender> Max-P : yes ! you are right sir
[10:19:11 CEST] <Max-P> And I don't really know libav*, more the ffmpeg command-line utility, and I don't know much about Java and Android either, so I couldn't help you there anyway
[10:20:17 CEST] <satiender> ok
[10:21:27 CEST] <Max-P> You really should ask the right people in #android-dev, they'll more likely to be able to help you than us. You need to learn how to use the NDK and bridge the Java side to the C side.
[10:21:53 CEST] <satiender> Max-P : yes
[10:22:10 CEST] <Max-P> I can point you to the documentation for the various components, but you'll have to learn how to connect those together
[10:22:34 CEST] <satiender> Max-P : Thanks
[10:22:44 CEST] <satiender> Max-P : I am happy for that
[10:25:05 CEST] <satiender> Max-P : my one question sir to you for video processing
[10:25:32 CEST] <satiender> Max-P : Can we develop our own filter without the use of any library
[10:25:44 CEST] <satiender> ??
[10:26:02 CEST] <Max-P> Sure. You get each frame as an image, you do whatever you want with it
[10:27:04 CEST] <satiender> Please give any tutorial link for that
[10:28:20 CEST] <Max-P> Want to flip it? Easy. For a 200x100 image, you take image[x,0] and put it at image[x,99], image[x,1] at image[x,98], and so on up to y = 100/2: you just vflipped the image
[10:28:41 CEST] <Max-P> How you process it, that's up to you to come up with something interesting
[10:29:52 CEST] <Max-P> You can change the colors, draw something on top of it, remove colors, blur it.
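As a concrete illustration of both ideas, here is a sketch in C operating on a packed RGB24 bitmap; width, height, stride and data are assumed to come from whatever decoded the frame:

    #include <stdint.h>
    #include <string.h>

    /* vertical flip: swap the top row with the bottom row, and so on inward */
    void vflip_rgb24(uint8_t *data, int height, int stride)
    {
        uint8_t tmp[stride];                     /* one row of pixels (C99 VLA) */
        for (int y = 0; y < height / 2; y++) {
            uint8_t *top = data + y * stride;
            uint8_t *bot = data + (height - 1 - y) * stride;
            memcpy(tmp, top, stride);
            memcpy(top, bot, stride);
            memcpy(bot, tmp, stride);
        }
    }

    /* crude green tint: a little more G, a little less R and B, clamped */
    void green_tint_rgb24(uint8_t *data, int width, int height, int stride)
    {
        for (int y = 0; y < height; y++) {
            uint8_t *px = data + y * stride;
            for (int x = 0; x < width; x++, px += 3) {
                px[0] = px[0] > 16  ? px[0] - 16 : 0;    /* R down */
                px[1] = px[1] < 239 ? px[1] + 16 : 255;  /* G up   */
                px[2] = px[2] > 16  ? px[2] - 16 : 0;    /* B down */
            }
        }
    }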
[10:31:18 CEST] <satiender> ok Nice sir
[10:34:18 CEST] <satiender> Max-P : I want to blur and change the color of the video
[10:35:01 CEST] <Max-P> Then use one of the AVFilter ones, we have both
[10:35:01 CEST] <satiender> I searched a lot on google but did not get any idea for implementing that in C
[10:35:26 CEST] <satiender> yes I did it with ffmpeg
[10:35:58 CEST] <Max-P> Or do it yourself. Want to add more green? Take red, remove a little. Take green, add a little. Take blue, remove a little
[10:35:59 CEST] <satiender> but if I want to implement that in C
[10:37:30 CEST] <Max-P> Look at the example I linked above. It has everything you need
[10:38:00 CEST] <Max-P> It tells you how to build ffmpeg for Android, how to add it, it even shows how to call the C functions from the Java code (in MainActivity.java)
[10:38:53 CEST] <Max-P> Do it in C, or if you don't know C learn C and then do it in C, and then push back the result to Java like in the example
[10:40:38 CEST] <Max-P> And you really can't escape it. There are no native Java bindings for FFmpeg, and there probably never will be, because at that point it would be better to reimplement the whole thing in Java
[10:41:55 CEST] <satiender> https://github.com/roman10/android-ffmpeg-tutorial that link
[10:42:03 CEST] <Max-P> Yes, that link
[10:42:08 CEST] <satiender> ok
[10:42:52 CEST] <satiender> Actually, if I compile ffmpeg for android then the size of my app grows
[10:43:00 CEST] <satiender> but I want less size
[10:43:23 CEST] <satiender> So I want only those things which are used in my app
[10:44:33 CEST] <satiender> So that is the reason I am asking whether we can write our own blur or color filter for that purpose
[10:46:10 CEST] <Max-P> As I said at the very beginning, Android also has native methods to do that. So we circle back to the beginning, but the process is the exact same: decode video frames, process it, then display and/or save it
[10:46:46 CEST] <satiender> ok
[10:46:47 CEST] <Max-P> But you said you wanted ffmpeg because of the filters
[10:47:03 CEST] <Max-P> So I tell you how to do it in ffmpeg
[10:47:25 CEST] <satiender> Max-P : yes I know
[10:47:40 CEST] <Max-P> Now you say you want to write your own filter, alright, replace avcodec by your own
[10:48:07 CEST] <Max-P> And now you're telling me ffmpeg is too big...
[10:49:40 CEST] <satiender> Max-P : Actually the idea of writing my own filter came to my mind a few minutes ago, after I asked you about native code
[10:51:36 CEST] <satiender> if we can do that in an efficient way then we can reduce the memory consumption
[10:51:46 CEST] <satiender> of the android phone
[10:58:37 CEST] <satiender> Max-P : big thank you for help
[10:58:41 CEST] <satiender> :)
[12:35:01 CEST] <satiender> Max-P : I don't understand how we can check the video's current position
[14:22:51 CEST] <Arwalk> Hi. Anyone proficient with ffmpeg's C library and ready to help me?
[14:23:25 CEST] <sagax> just write your question
[14:23:43 CEST] <Timster> Hey, guys. Is ffmpeg capable of doing Interpolation?
[14:52:59 CEST] <durandal_1707> Timster: interpolation of what?
[14:58:25 CEST] <DHE> A framerate doubler that produces unique new frames?
[14:58:29 CEST] <DHE> (That's my guess)
[15:15:05 CEST] <Timster> durandal_1707, DHE - yes, a frame doubler to increase FPS
[15:23:24 CEST] <iive> i guess this is mcfps that michaelni is working on.
[15:37:44 CEST] <durandal_1707> there is also framerate filter
[16:01:24 CEST] <Timster> What is mcfps, iive ?
[16:01:56 CEST] <iive> video filter
[16:02:19 CEST] <Timster> But it's not yet in the builds, I guess?
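If the framerate filter mentioned above is present in your build, a doubling from 30 to 60 fps could look like this; it interpolates by blending neighbouring frames rather than by motion compensation, and the filenames are placeholders:

    ffmpeg -i in.mp4 -vf framerate=fps=60 out.mp4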
[16:21:10 CEST] <ace040> hello!
[16:22:05 CEST] <ace040> I am currently developing closed-source/proprietary Windows software that is meant to author video
[16:22:18 CEST] <ace040> files are created and encoded with the help of LGPL-licensed FFmpeg DLLs
[16:22:58 CEST] <ace040> for H.264 support, I need to link a commercially-licensed build of libx264 (also distributed as a DLL) against our FFmpeg build
[16:23:30 CEST] <ace040> so basically I need to build x264 with --disable-gpl and FFmpeg with --enable-libx264 but WITHOUT --enable-gpl (obviously)
[16:24:07 CEST] <ace040> the vanilla configure from FFmpeg 2.7.2 does not allow this combination, so what I've done is comment out line 4537 as follows:
[16:24:24 CEST] <ace040> die_license_disabled gpl libx264 -> #die_license_disabled gpl libx264
[16:24:47 CEST] <JEEB> is the libx264 wrapper LGPL or GPL?
[16:24:47 CEST] <ace040> finally my question being: does this workaround sound ok to you, or is there another intended procedure to use non-GPL libx264 with LGPL FFmpeg?
[16:25:07 CEST] <JEEB> and you should bring your issue to either #ffmpeg-devel or the ffmpeg-devel mailing list. I don't think anyone so far has brought this use case up
[16:25:30 CEST] <JEEB> the configure is currently written for GPL x264 usage only, but it definitely is a valid use case
[16:25:38 CEST] <ace040> ok, isn't ffmpeg-devel related to internal development only?
[16:25:49 CEST] <JEEB> well this is pretty much that :P how to handle non-GPL x264
[16:26:02 CEST] <JEEB> most people just use libx264 APIs themselves not going through libavcodec for it
[16:26:16 CEST] <JEEB> and then handle the things around it with libav*
[16:26:33 CEST] <JEEB> like decode with libavcodec, pass to libx264, get stuff from libx264, mux with libavformat
[16:26:53 CEST] <ace040> JEEB> yes, actually I found one mention of such a use case in the mailing list archives: https://ffmpeg.org/pipermail/ffmpeg-devel/2010-December/083754.html
[16:28:08 CEST] <ace040> oh ok I get it, so we wouldn't need the wrapper then
[16:28:29 CEST] <JEEB> but yeah, definitely bring this up on #ffmpeg-devel again since after passing through some replies I don't see any actions taken
[16:28:36 CEST] <JEEB> or ffmpeg-devel ML
[16:28:38 CEST] <JEEB> whichever you prefer
[16:29:42 CEST] <c_14> The wrapper has the LGPL license header, so it should be fine. You should probably bring it up anyway though.
[16:29:54 CEST] <JEEB> yeah, I read the thread up
[16:30:00 CEST] <JEEB> nothing ended up done
[16:30:22 CEST] <JEEB> basically, opinions and even patches were posted, but nothing was done
[16:30:35 CEST] <JEEB> and yeah, if the wrapper is LGPL then it should probably be fine
[16:33:31 CEST] <ace040> ok, I will ask #ffmpeg-devel for this then
[16:33:38 CEST] <ace040> thanks a lot folks
[16:43:25 CEST] <dindu> is there a cluster ffmpeg encoding script?
[16:55:29 CEST] <DHE> someone was asking about having several systems contribute to encoding one large video file. is this what you're asking?
[16:56:33 CEST] <dindu> yes, i have 20 machines and am looking to encode a single video on all of them.
[18:37:09 CEST] <Arwalk> Hey guys. Anyone proficient with ffmpeg's C library and ready to help me?
[18:45:35 CEST] <Arwalk> Please?
[18:46:49 CEST] <durandal_1707> Yes?
[19:38:51 CEST] <tlhiv_work> i modified ffmpeg.c about three years ago (version 0.10.7) to include this code --> http://pastebin.tlhiv.org/4s_Cu9am ... the purpose was to be able to press the SPACEBAR while recording to store the times (in seconds) throughout the recording as "break points" ... i tried using the same code in version 2.7.2 and i'm getting this error --> http://pastebin.tlhiv.org/zXsDXjD2 ... i would appreciate any insight into solving this problem
[19:40:10 CEST] <tlhiv_work> sorry ... this is the error (when trying to compile) --> http://pastebin.tlhiv.org/sbpkzykQ
[19:40:18 CEST] <tlhiv_work> ignore the http://pastebin.tlhiv.org/4s_Cu9am paste
[20:05:19 CEST] <tlhiv_work> found the problem
[20:47:48 CEST] <Azelphur> Hey folks, I'm using Emby (for anyone who hasn't heard of it it's a web service that transcodes media on the fly to webm and streams it in a web browser, it uses ffmpeg to achieve this) I'm wondering if there's any way to shift this load onto the GPU, does ffmpeg support this yet?
[20:49:44 CEST] <bblinder> it's been available for a while, I think?
[20:49:47 CEST] <DHE> ffmpeg only supports a few filters (2 maybe?) with opencl
[20:49:49 CEST] <bblinder> https://trac.ffmpeg.org/wiki/HWAccelIntro
[20:50:20 CEST] <Azelphur> Interesting, makes me wonder why emby is having so much trouble with it, it has been a long time and the devs seem to be struggling
[20:51:14 CEST] <bblinder> I think it's also limited to the actual hardware
[20:53:01 CEST] <DHE> are any webm codecs actually offloaded though? x264 is the only one I know of with opencl and it's not actually a webm codec (right?)
[20:53:55 CEST] <JEEB> Azelphur: because GPUs are good at pictures, they're not good at lossy compression formats
[20:54:07 CEST] <JEEB> which is what all of the vendors noticed some time ago after selling their kool-aid
[20:54:31 CEST] <JEEB> which is why intel, nvidia and amd all now have specific encoder hardware (ASICs) on their things
[20:54:42 CEST] <JEEB> instead of trying to do it on the general purpose part of the GPU
[20:55:22 CEST] <JEEB> the opencl thing in x264 works kind of... but it's just a good way to get a very expensive heater rather than giving you something really useful
[20:55:38 CEST] <JEEB> it's a proof of concept that a small part of x264 can be run in a separate thread on the GPU
[20:55:53 CEST] <Azelphur> I see
[20:57:23 CEST] <JEEB> and regarding vp8 (I think that's what it's using for webm? because vp9 is pretty much unused and really slow unless you go out of your way to remove features)
[20:57:39 CEST] <JEEB> that might have some encoding stuff in some GPUs, but I have no idea
[20:58:36 CEST] <Azelphur> Sounds like it's a lot more complicated than I thought it would be, :)
[20:59:40 CEST] <JEEB> basically, doing encoding on the general-purpose part of the GPU makes no sense, but if you decode the video stream with a dedicated component and encode it with such, then you might as well do some video filtering on that part
[20:59:55 CEST] <JEEB> that way you wouldn't have to move stuff between RAM and VRAM, either
[21:00:15 CEST] <JEEB> which often was a bottleneck with the other things that were tried during the years
[21:01:28 CEST] <feliwir> hey i wanted to ask if it's ok to put the ffmpeg tarball inside a public repository?
[21:03:32 CEST] <feliwir> and is static linking on windows allowed or not? The faq says it isn't
[21:04:01 CEST] <feliwir> https://www.ffmpeg.org/legal.html
[21:04:01 CEST] <c_14> Assuming you mean the source tarball, yes. (as long as you don't mess with the license headers etc)
[21:05:25 CEST] <c_14> And you might need a lawyer for the second. (I honestly don't know, but there's probably a reason it's listed there)
[21:06:56 CEST] <feliwir> but why is static compiling on windows possible at all then? And who would sue me if i do static link?
[21:07:41 CEST] <DHE> doing a static link is fine by itself, but the result may not be redistributable
[21:09:15 CEST] <feliwir> DHE, what do you mean by that? I thought in france, for example, there are no software patents
[21:09:54 CEST] <c_14> Patents != Licensing
[21:11:43 CEST] <feliwir> uhm, i don't get where the difference should be if i use static or dynamic
[21:14:02 CEST] <DHE> when you link statically and some code is GPL, you end up with the whole EXE being GPL licensed which may conflict with other licenses. the result can't be redistributed while honouring the license
[21:16:01 CEST] <feliwir> ah that makes sense
[21:16:14 CEST] <feliwir> but i think everything i use is either gpl or lgpl
[22:03:16 CEST] <llogan> tried 3 different computers and this intensity shuttle usb3 doesn't output anything...
[22:18:23 CEST] <iive> ?
[22:19:10 CEST] <llogan> just bitchin
[22:19:41 CEST] <iive> you have some usb3 device you are testing?
[22:20:11 CEST] <cbsrobot-> if you have no one to troll, start bitchin
[22:22:40 CEST] <llogan> yeah, some decklink garbage. i'll return it.
[22:23:24 CEST] <llogan> from blackmagic
[22:23:44 CEST] <cbsrobot-> did you test it on all platforms ?
[22:24:43 CEST] <llogan> two windows and a linux
[22:25:31 CEST] <iive> yeh... looks like defective device...
[22:25:45 CEST] <iive> is it even discovered by lsusb ?
[22:26:35 CEST] <llogan> yes
[22:28:17 CEST] <cbsrobot-> even audio is not working ?
[22:29:06 CEST] <llogan> no "we get signal", but i'm done screwing with it. just got an RMA, and in the box it goes. it's not really needed for this project anyway
[22:50:03 CEST] <Mavrik> llogan, it's only compatible with some USB chipsets
[22:50:11 CEST] <Mavrik> I had that PoS :P
[22:50:29 CEST] <Mavrik> The thing is, most USB chipsets don't have enough bus bandwidth so it doesn't work :)
[22:51:58 CEST] <llogan> Mavrik: fun times.
[22:52:11 CEST] <Mavrik> I did manage to find a PCIe 4x card that took it ;)
[22:52:18 CEST] <llogan> although this looked interesting http://git.sesse.net/?p=bmusb;a=blob;f=README
[22:54:19 CEST] <llogan> i just needed to capture some VHS tapes, and wanted to try something other than this ancient canopus advc110 (DV output), but i ran out of rat asses to give.
[22:54:47 CEST] <llogan> they'll look shitty enough that it won't matter
[22:55:20 CEST] <Mavrik> Yeah, also as far as I can remember the shuttle thing outright refused to capture anything that's not 720p or 1080i/p at some very standard FPS
[22:59:14 CEST] <Chocola4> Is anyone familiar with how timecode is counted at 23.976p ?
[23:00:02 CEST] <Chocola4> the timecode produced by the camera is counted at 24p, which is not exactly correct for 23.976
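For context: 23.976 is really 24000/1001 fps, and unlike 29.97 there is no standard drop-frame scheme for it, so 24 fps timecode simply drifts against wall-clock time. One hour of 24 fps timecode is 24 * 3600 = 86400 frames, which at 24000/1001 fps takes 86400 * 1001 / 24000 = 3603.6 seconds to play, i.e. the timecode falls about 3.6 seconds behind real time per hour.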
[00:00:00 CEST] --- Wed Sep 9 2015