[FFmpeg-user] Use processed filename as draw text after tmix, in one pass?

Steven Kan steven at kan.org
Thu Apr 25 06:05:12 EEST 2024


> On Apr 24, 2024, at 11:24 AM, William C Bonner <wimbonner at gmail.com> wrote:
> 
> On Sun, Apr 21, 2024 at 12:08 PM Steven Kan <steven at kan.org> wrote:
> 
>>> On Mar 6, 2024, at 11:49 AM, Steven Kan <steven at kan.org> wrote:
>>> 
>>> Planning ahead to the successor to my honeycomb time-lapse video from 2022, processed using tmix per advice from this group:
>>> 
>>> https://ffmpeg.org//pipermail/ffmpeg-user/2022-April/054742.html
>>> 
>>> https://www.youtube.com/watch?v=2dUGbGcGE2c
>>> 
>>> This time I’m capturing the photos with a dSLR and gphoto on a Raspberry Pi, and the dSLR does not burn in a timestamp.
>>> 
>>> This is actually a good thing, because I don’t like how my timestamps got tmixed away last time. I’d like to apply them after tmix.
>>> 
>>> I’m saving the photos with the timestamp as the sortable filename, e.g.
>>> 
>>> 2024-03-06-11-40-11.jpg
>>> 
>>> This time I’d like to tmix 50 frames, read the filename of the 50th frame, re-arrange the text of the filename to U.S. style, e.g. "03/06/24, 11:40:11 AM", and then drawtext it onto the output.
>>> 
>>> Can this be done in one pass? Or would I need to do a first pass to create the text fields in some companion files, e.g. 2024-03-06-11-40-11.txt, or even multiple passes to do the tmix first and then the drawtext? I’d like to avoid multiple passes of video processing to avoid generation loss, if possible.
>> 
>> Partial answer to my own question, from here:
>> 
>> 
>> https://superuser.com/questions/717103/burn-filenames-of-single-images-into-ffmpeg-output-video
>> 
>> So this command works:
>> 
>> ffmpeg -f image2 -export_path_metadata 1 -pattern_type glob -i '*.jpg' -vf "drawtext=text='%{metadata\:lavf.image2dec.source_basename\:NA}': fontfile=/System/Library/Fonts/Helvetica.ttc:fontcolor=white: fontsize=48: x=(w-text_w)*0.01: y=(h-text_h)*0.98" -y CombLapseWithFilenames.mp4
>> 
>> and creates:
>> 
>> https://www.kan.org/download/CombLapseWithFilenames.mp4
>> 
>> But the drawn text is the original filename, e.g. 2024-04-21-08-20-11.jpg,
>> a format I chose in my photo-taking script so that the files sort properly.
>> 
>> Can I reformat that into U.S. style, e.g. “04/21/24, 08:20:11”, strip the
>> .jpg extension, and do all of this in one pass?
>> 
> 
> I'd recommend using the metadata for your timelapse if it's available
> instead of the filename. This is what I'm using in my windows project that
> calls ffmpeg to create movies from gopro time lapse photos.
> 
> drawtext=fontfile=C\\:/Windows/Fonts/consola.ttf:fontcolor=white:fontsize=main_h/16:y=main_h-text_h-50:x=50:text=%{metadata\\:DateTimeOriginal}

Thanks! This works, and I agree that it’s better than using the filename:

ffmpeg -pattern_type glob -i '*.jpg' -vf "drawtext=text='%{metadata\\:DateTimeOriginal}': fontfile=/System/Library/Fonts/Helvetica.ttc:fontcolor=white: fontsize=48: x=(w-text_w)*0.01: y=(h-text_h)*0.98" -y CombLapseWithTimeStamp.mp4

I can now add tmix after drawtext:

ffmpeg -pattern_type glob -i '*.jpg' -vf "drawtext=text='%{metadata\\:DateTimeOriginal}': fontfile=/System/Library/Fonts/Helvetica.ttc:fontcolor=white: fontsize=48: x=(w-text_w)*0.01: y=(h-text_h)*0.98, tmix=frames=10:weights='1'" -y CombLapseWithTimeStampAndTmix.mp4

And it renders, but the timestamps get blended.
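
The obvious next thing to try would be to move drawtext after tmix, but I don’t know whether the per-frame DateTimeOriginal metadata survives tmix (or which of the mixed frames it would be taken from), so this untested variant might draw nothing:

# untested; the output filename is just a placeholder
ffmpeg -pattern_type glob -i '*.jpg' -vf "tmix=frames=10:weights='1', drawtext=text='%{metadata\\:DateTimeOriginal}': fontfile=/System/Library/Fonts/Helvetica.ttc:fontcolor=white: fontsize=48: x=(w-text_w)*0.01: y=(h-text_h)*0.98" -y CombLapseTmixThenText.mp4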

Alternatively, can I use “split” to make one stream of images from which to extract the timestamp, and another stream for tmix, and then overlay the timestamp after tmix? How would I keep the two streams in sync if tmix ends up N frames shorter than the original?
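
Here’s an untested sketch of what I have in mind, in case it helps: split the decoded frames, tmix one branch, and on the other branch crop a small strip, fill it with a solid bar, drawtext the timestamp onto it, and then overlay that strip onto the tmixed branch. The strip geometry (an 800x72 bar at the bottom-left) and the output filename are just placeholders, and I’m assuming the per-frame DateTimeOriginal metadata still reaches drawtext after split and crop:

# untested sketch; strip size/position and output name are placeholders
ffmpeg -pattern_type glob -i '*.jpg' -filter_complex "[0:v]split=2[clean][label]; [clean]tmix=frames=10:weights='1'[mixed]; [label]crop=800:72:0:ih-72, drawbox=x=0:y=0:w=iw:h=ih:color=black:t=fill, drawtext=text='%{metadata\\:DateTimeOriginal}': fontfile=/System/Library/Fonts/Helvetica.ttc:fontcolor=white: fontsize=48: x=10: y=(h-text_h)/2[stamp]; [mixed][stamp]overlay=x=(main_w-overlay_w)*0.01:y=(main_h-overlay_h)*0.98[out]" -map "[out]" -y CombLapseSplitOverlay.mp4

On the sync question, my understanding is that overlay uses framesync with eof_action=repeat by default, so if one branch comes up a few frames short its last frame just gets repeated, but I haven’t verified how that behaves with tmix in the chain. (In the log below, the tmix pass put out the same 10 frames it took in, so the two branches may already line up 1:1.)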

Thank you!!

ffmpeg -pattern_type glob -i '*.jpg' -vf "drawtext=text='%{metadata\\:DateTimeOriginal}': fontfile=/System/Library/Fonts/Helvetica.ttc:fontcolor=white: fontsize=48: x=(w-text_w)*0.01: y=(h-text_h)*0.98, tmix=frames=10:weights='1'" -y CombLapseWithTimeStampAndTmix.mp4
ffmpeg version N-109776-g7e1d474021-tessus  https://evermeet.cx/ffmpeg/  Copyright (c) 2000-2023 the FFmpeg developers
  built with Apple clang version 11.0.0 (clang-1100.0.33.17)
  configuration: --cc=/usr/bin/clang --prefix=/opt/ffmpeg --extra-version=tessus --enable-avisynth --enable-fontconfig --enable-gpl --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libfreetype --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libmysofa --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvmaf --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-version3 --pkg-config-flags=--static --disable-ffplay
  libavutil      57. 44.100 / 57. 44.100
  libavcodec     59. 63.100 / 59. 63.100
  libavformat    59. 38.100 / 59. 38.100
  libavdevice    59.  8.101 / 59.  8.101
  libavfilter     8. 56.100 /  8. 56.100
  libswscale      6.  8.112 /  6.  8.112
  libswresample   4.  9.100 /  4.  9.100
  libpostproc    56.  7.100 / 56.  7.100
Input #0, image2, from '*.jpg':
  Duration: 00:00:00.40, start: 0.000000, bitrate: N/A
  Stream #0:0: Video: mjpeg (Baseline), yuvj422p(pc, bt470bg/unknown/unknown), 3008x2000, 25 fps, 25 tbr, 25 tbn
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
Press [q] to stop, [?] for help
[libx264 @ 0x7fddb8f07540] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2
[libx264 @ 0x7fddb8f07540] profile High 4:2:2, level 5.1, 4:2:2, 8-bit
[libx264 @ 0x7fddb8f07540] 264 - core 164 r3106 eaa68fa - H.264/MPEG-4 AVC codec - Copyleft 2003-2023 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=18 lookahead_threads=3 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'CombLapseWithTimeStampAndTmix.mp4':
  Metadata:
    encoder         : Lavf59.38.100
  Stream #0:0: Video: h264 (avc1 / 0x31637661), yuvj422p(pc, progressive), 3008x2000, q=2-31, 25 fps, 12800 tbn
    Metadata:
      encoder         : Lavc59.63.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame=   10 fps=6.1 q=-1.0 Lsize=    1293kB time=00:00:00.28 bitrate=37843.9kbits/s speed=0.171x    
video:1293kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.069884%
[libx264 @ 0x7fddb8f07540] frame I:1     Avg QP:23.93  size:642058
[libx264 @ 0x7fddb8f07540] frame P:9     Avg QP:24.96  size: 75652
[libx264 @ 0x7fddb8f07540] mb I  I16..4:  1.6% 90.1%  8.3%
[libx264 @ 0x7fddb8f07540] mb P  I16..4:  0.0%  0.6%  0.0%  P16..4: 59.0%  9.9%  7.5%  0.0%  0.0%    skip:22.9%
[libx264 @ 0x7fddb8f07540] 8x8 transform intra:90.0% inter:77.8%
[libx264 @ 0x7fddb8f07540] coded y,uvDC,uvAC intra: 95.0% 99.2% 74.9% inter: 29.7% 69.0% 1.9%
[libx264 @ 0x7fddb8f07540] i16 v,h,dc,p: 16% 33%  2% 49%
[libx264 @ 0x7fddb8f07540] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 25% 13%  5%  7%  7%  7%  7% 10%
[libx264 @ 0x7fddb8f07540] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 20% 27%  8%  6%  9%  8%  8%  7%  7%
[libx264 @ 0x7fddb8f07540] i8c dc,h,v,p: 35% 28% 26% 11%
[libx264 @ 0x7fddb8f07540] Weighted P-Frames: Y:88.9% UV:88.9%
[libx264 @ 0x7fddb8f07540] ref P L0: 84.5%  9.9%  5.4%  0.1%  0.0%
[libx264 @ 0x7fddb8f07540] kb/s:26458.46


