Thursday, October 04, 2012

Making Music using Rosegarden on Fedora 17

I’ve always loved music – as do both of my parents. They have excellent but divergent tastes in music. With my Mum I share a love of Sandy Denny, Jeff Lynne and George Harrison; with my father I shared an affection for Eric Coates, Henry Hall and G. F. Handel. And when you mix the two together you get my love of Maestoso, Mike Oldfield, Kevin Ayers and Barclay James Harvest.

As well as listening to music, I also enjoy making it. But I always thought making music on a computer seemed so difficult that I never really bothered trying.

However, I recently got a bit of inspiration from my friend TA Walker (Tim). Earlier this year Tim signed up for something called the 5090 Challenge – writing 50 songs in 90 days. Given that Tim has a full-time job, a wife and a young daughter, that was insanely ambitious – but astoundingly he managed 36 excellent songs, which I have been known to raid for my YouTube videos. In order to reach his goal Tim was making music anywhere using anything – he was even overdubbing vocals and recording guitalele in his car during his lunch-breaks using an iPod Touch. Here is Tim playing one of his 5090 songs:


So, if Tim could make music in a car (or on a very nice looking white leather sofa) I had no excuse sitting in front of a computer that had access to a repository of free software for making noises.

I'm using Fedora 17 and I wanted to try and record music entirely using free software. This is because a) I'm on a budget of £0 and b) I think it’s the right thing to do.

Rosegarden running on Fedora 17

The first program I tried to install was something called Rosegarden. As music programs go it seemed pretty welcoming for beginners, and therefore a good place to start. It uses staves and notes – things that a dinosaur like me can (almost!) understand. However, before I could get Rosegarden to make any noise I needed a synthesiser. I don’t have a real synthesiser, so instead I needed a soft synthesiser – a program that runs on the computer and pretends it’s a real synthesiser sitting on your table.

The synthesiser that everyone seemed to recommend was something called FluidSynth, so I thought I’d install that. FluidSynth is a free software synthesiser that can take MIDI data from a program like Rosegarden and turn it into audio.

It normally comes with a “SoundFont” bank containing a nice range of sounds for a beginner, so it seemed a good start. However to use FluidSynth it’s best to have a nice graphical interface so you can fiddle with it using knobs and buttons on your desktop. The most common one is called QSynth. It looks very impressive!

A very impressive addition to any desktop!
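
Incidentally, if you’re curious about what QSynth is doing for you, FluidSynth can also be started from a terminal. This is only a sketch – the SoundFont path below is where Fedora usually keeps its default bank, and yours may differ:

fluidsynth -a jack -m alsa_seq /usr/share/soundfonts/default.sf2

The -a option picks the audio driver (JACK) and -m the MIDI driver.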

Only, before I could use the virtual synthesiser, I needed something to plug it into the computer’s sound hardware. In other words, FluidSynth needs somewhere to send all the audio it’s creating. That somewhere is a piece of software called the JACK Audio Connection Kit (JACK). And before I could use JACK, I thought I’d find life easier if I had something graphical to control JACK with. So I needed something called QJackCtl – a graphical JACK control panel.

QJackCtl with JACK not started

So I downloaded all the bits I needed. I had Rosegarden (a music studio), FluidSynth (a synthesiser), JACK (a sound server), QJackCtl (a graphical interface for JACK) and QSynth (a graphical interface for FluidSynth). It was, near enough, the house that JACK built.

Now I tried to make a noise. I worked out after a couple of minutes that it’s not enough to simply load QJackCtl – JACK has to be started and stopped by pressing the Start and Stop buttons. So I tried to start JACK and it did nothing but spit error messages at me and I certainly couldn’t get anything to make any sound.

Now, this is where the cutting-edgeness of Fedora bit me on the bum. Normally you should be able to start JACK and it will work without error. And indeed, since this morning’s software repository updates, that’s exactly what it does. However, at the time there was a problem within Fedora that stopped the ALSA sequencer MIDI module (snd-seq-midi) loading automatically, so I needed to type:

su -c "modprobe snd-seq-midi"
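
Running modprobe by hand only lasts until the next reboot. If the problem comes back, Fedora’s systemd should – I believe, so treat this as a sketch – load the module at every boot if you drop its name into a file under /etc/modules-load.d:

su -c 'echo snd-seq-midi > /etc/modules-load.d/snd-seq-midi.conf'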

It took me an hour or so to find that out, and until I did I couldn’t start JACK or make any noise at all. Normally I would have given up long before this point, but with M4 and Mr Cable The Sysadmin ringing in my ears I was determined and pressed on.

There were a couple of other things I had to do in JACK to get it to work. After pressing the Setup... button I had to uncheck Realtime, check Force 16bit and change the interface to hw:0.
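
For what it’s worth, those Setup changes amount to roughly this jackd command line. QJackCtl assembles the real one for you and flags vary between JACK versions, so treat this as a sketch – -r disables realtime mode, the second -d picks the hw:0 interface and -S forces 16-bit:

jackd -r -d alsa -d hw:0 -S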

QJackCtl with JACK started

With JACK running happily, I started QSynth to get FluidSynth running. Everything seemed OK, so the next step was to run Rosegarden. No problems. I opened one of the examples in the examples folder, pressed the play button and success! Music!

However, there was music on my headphones only – nothing was coming out of my speakers. I went to QJackCtl and pressed the Connect button to see what was going on.

QSynth in headphone-only mode

As you can see, the left output of QSynth (l_00) was going to my system’s playback_1 and the right output of QSynth (r_00) was going to my system’s playback_2. This was giving me music in my headphones. However, what were the other playbacks?

QSynth will now use my speakers too

I tried connecting the left output of QSynth (l_00) to playback_3 and the right output (r_00) to playback_4, and it worked. Music through my speakers!
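
You can make the same connections from a terminal with jack_connect if you prefer. The client and port names below are assumptions based on what appeared on my machine (QSynth seems to register itself as qsynth) – check your own Connect dialog first:

jack_connect qsynth:l_00 system:playback_3
jack_connect qsynth:r_00 system:playback_4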

So every time I want to make music I…
  1. load QJackCtl, 
  2. start JACK by pressing the Start button, 
  3. load QSynth 
  4. then load Rosegarden
…always in that order. (A little launcher script, sketched below, can take care of most of the clicking.)
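
Here’s a minimal launcher sketch. The program names are the Fedora package defaults and the sleep times are pure guesswork to give each program time to settle; JACK itself still needs starting with the Start button unless you tick “Start JACK audio server on application startup” in QJackCtl’s Setup dialog:

#!/bin/sh
# Launch the whole stack in the right order.
qjackctl &
sleep 5    # press Start in QJackCtl (or enable auto-start in Setup)
qsynth &
sleep 5
rosegarden &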

Provided I just wanted to enter musical notation into Rosegarden I was now fine, but that’s not much fun. The frustrated Woolly Wolstenholme in me wanted to have a keyboard to play!

The trouble was that, as well as not having a synthesiser, I don’t have a keyboard either. Fortunately there are “virtual keyboards” available that allow you to play music using your computer’s keyboard. The one I chose, out of a field of three, was called Virtual MIDI Piano Keyboard (VMPK). I chose it because it was the only one that seemed able to play nicely with the Hungarian keyboard on my computer.

Be Woolly in the comfort of your own home…

However, recording the MIDI data created with a virtual keyboard meant I had to plug it into something that records MIDI data – Rosegarden. It was back to the QJackCtl Connect dialog:

VMPK running, but QJackCtl shows nothing to plug it into

VMPK had appeared in the MIDI tab of the QJackCtl Connect dialog. The trouble was, nothing else did – the only thing I could plug my virtual keyboard into was itself.

This proved to be a very tricky problem to sort out. It took me a long time to find the answer, which turned out to be a program called a2jmidid. Apparently there are two kinds of MIDI on a GNU/Linux machine – ALSA MIDI and JACK MIDI – and they can’t talk to each other without a “bridge” program. The bridge is called a2jmidid and it’s available in the Fedora repository. To use it I had to start a terminal window and type:

a2jmidid
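
If you’d rather not sacrifice a terminal window, you can background it instead. I believe the -e switch also bridges any hardware MIDI ports, though I haven’t needed that myself:

a2jmidid -e &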

Then, provided I kept a2jmidid running, when I went back to my QJackCtl Connect dialog I got some extra things in the MIDI tab:

VMPK connected to Rosegarden in QJackCtl

As you can see, I can now connect the VMPK Output to the Rosegarden record in and, hey bingo, I’ve got a MIDI keyboard connected to Rosegarden.

VMPK configured for a Magyar keyboard

The only thing left to do with VMPK was to create a Hungarian key mapping – very easy to do using a dialog provided by the program.

The first thing I wanted to try and record was a tune I remembered from my childhood. It was an early music or baroque piece for recorder and a small ensemble, used by the Open University before their morning broadcasts. I have not heard it since those early mornings in the 1980s when I used to get up to watch a maths foundation course on calculus or the foundation course on modern art.

A lost childhood memory

I did a rather twee arrangement using a harpsichord, viola, cello and a recorder. I think the real thing was less Bach and more Henry VIII.

However, when I came to play it, the recorder just didn't sound right. It sounded very good, but it didn't sound like the recorder I had in my head. So I looked on-line to see if there were any other SoundFont banks I could use with QSynth.

I was in luck, because the pianist and composer S. Christian Collins has put together an excellent SoundFont bank for QSynth and put it on his website here. It’s called the GeneralUser GS SoundFont bank.

GeneralUser GS SoundFont bank loaded into QSynth

To load it I had to get QSynth running and press Setup…. Next, I had to go to the Soundfonts tab and replace the default SoundFont bank (an .sf2 file) with the GeneralUser GS bank I had downloaded.

To my delight, the recorder now sounded much more how I wanted it to.

Now that I had finished and was happy with my sounds, I realised I needed some way of recording what I’d just done as an audio file instead of a Rosegarden file.

When I ran QJackCtl with nothing else running the Connect dialog looked like this:

By default I can only get sound from the microphone

If you look at the Readable Clients box on the left, you'll see the only places I could get any audio from were capture_1 and capture_2. These represent the microphone socket on my computer: capture_1 is the left channel and capture_2 the right channel of the stereo microphone input.

If I ran Rosegarden I found they were connected automatically to Rosegarden's record in 1 L and record in 1 R Input Ports:

Rosegarden connects to the microphone automatically

I looked in the QJackCtl Setup dialog and saw a Monitor check box which was unchecked. It sounded like what I needed so I checked it.

However you can enable monitoring

When I restarted JACK I saw this:

And route the monitor output where you want it

So now I had monitor output, in addition to microphone input, as a potential source of audio. Monitor output means I can record whatever I can hear through the speakers – which was just what I needed.

Such as here, where the monitor output is routed to Rosegarden

I started Rosegarden up again and connected monitor_1 and monitor_2 to the record in 2 L and record in 2 R inputs.

This meant that Rosegarden now had the system’s sound output available as a source of audio – I could use Rosegarden to record whatever Rosegarden was playing as an audio file!
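
From a terminal, the same routing would look something like this – assuming Rosegarden registers its ports under these names (the spaces mean they need quoting); do check your own Connect dialog before copying:

jack_connect system:monitor_1 "rosegarden:record in 2 L"
jack_connect system:monitor_2 "rosegarden:record in 2 R"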

You can easily turn the metronome off in Rosegarden

Setting this up in Rosegarden is quite easy and pretty logical once you work it out (which took me a long time!). The first thing you need to do is go to Studio -> Manage Metronome and turn off the click track. You usually don’t want that on your master recordings!

The next thing you need is an audio track that can accept the monitor output as its audio input:

Rosegarden all set up to record Rosegarden

You can see in the picture above I've set track 17 as my current track. It’s an audio track and I’ve called it RECORD.

On the left-hand side you'll notice that I've set the In: to In2. This is very important – In2 is the Rosegarden input we connected to the monitor output in QJackCtl earlier. Never use In1 – it’s quiet and full of interference noise!

Finally, you'll notice I've armed track 17 to record – shown by the red light to the left of the track name. Now when I press the record button, my Rosegarden file will play and be recorded as an audio file on track 17 at the same time.

My recorded Rosegarden output in Rosegarden

When the track has finished you will see the waveform displayed in your recording track as it is above.

Double-clicking on an audio track segment in Rosegarden opens Audacity

Now you can double-click on the recorded segment and it will open in Audacity. Don't forget to set Audacity to JACK Audio Output as I have in the picture above, or it will freeze and not play. From Audacity you can edit or export the audio in the usual way.

For Ogg Vorbis or MP3 files I normalize to -2.0 dB

I always save a lossless FLAC file from Audacity first. If I want a lossy format such as Ogg Vorbis or MP3, I always Normalize to -2.0 dB before I export.

Being able to set Audacity to use JACK audio output is very handy – particularly if you find you want to listen to audio files while you are working.

So now I had a FLAC file, an Ogg Vorbis file and an MP3 file. The FLAC file was fine, but what I really wanted was to get a picture into my MP3 and Ogg files so they would be just like the ones I downloaded from TA Walker’s Bandcamp page.

To do this I found an excellent program called EasyTAG, which does exactly what its name suggests. It lets you add a picture to your audio files and is very easy to use. Although I tend to use Ex Falso for most of my tagging (it’s better for classical music), I'll use EasyTAG for tagging my own files in future.



The next thing I decided to do was re-record the OU tune in Mike Oldfield style. When I was a child I remember watching Simon Groom visit Mike Oldfield to see him re-record the Blue Peter signature tune. That video had an enormous effect on me as a child, and recording something like it was something I always wanted to try.

I had a lot of fun in Rosegarden pretending to be Mike – particularly tapping away on my computer’s keyboard pretending to play the bodhrán.

When I finished, Tim very kindly recorded a real electric guitar solo for me to add to my track. He supplied it to me as a FLAC file, but the funny thing was I could not find any way of importing a FLAC file into Rosegarden – only .WAV files.

TA Walker’s solo shown on the red track

However, by accident, I discovered you could import FLAC files directly into Rosegarden if you dragged and dropped them onto the time track.

I'd enjoyed myself so much with the Open University tune that I decided to record another tune Mike Oldfield-stylee, so I dusted off my recording of Border Television’s Keltic Prelude March by L. E. DeFrancesco and did that as well!

The reason I did the Keltic Prelude March was so that I could re-upload my video of a Border Television start-up, which I had pulled down earlier this year because of a copyright claim over the track Holiday People by James Clarke, used over the menu rundown. I therefore decided to create a pastiche of Holiday People for my Border start-up, and came up with a tune I called Lionel’s Talking!

Lionel’s Talking in Hydrogen

I needed a “real” drum kit for Lionel’s Talking, so I used Hydrogen, a dedicated software drum machine, which does the job flawlessly. Hydrogen also works beautifully in tandem with Rosegarden – the Rosegarden website has a wonderful tutorial explaining how to set this up here.

So put it all together and what do you have? Well something like this…


Producing music on GNU/Linux can be a bewildering and frustrating experience at first. There are so many things you need – each one has to be set up correctly in order to work properly. There is a huge amount to learn and sometimes you feel the easiest thing is to just give up. I spent a lot of time trying, and failing, to do things which I thought should have been easy.

In addition, differences in hardware mean that what you have to do to get everything working is slightly different for everyone.

But with a little perseverance you find that things rapidly begin to make sense: there is a common logic underlying everything you have to do, and you begin working out answers to your problems yourself.

I hope you try making some music too.

Sunday, January 22, 2012

Synfig Studio Gate Weave

If I wasn't so stupid, I’d have realised that I could add gate weave to my animations in Synfig Studio without the need for any coding whatsoever.

Here’s an example:


You simply need to add a translate layer to your animation. The translate layer is used to move things around the canvas. Here’s one in my layers panel:

Translate layer in layers panel


The translate layer should be at the top, so everything in your animation will weave (Z Depth 0.000000).

Next, you need to convert the Translate layer's Origin into a Composite. That means the X-axis and Y-axis values are separated instead of being a vector.

You then convert the X-axis and Y-axis values to Random. Put in some suitable values, such as these:

Example X and Y axis values

Export the resulting animation as video and that’s all there is to it.

Sunday, January 15, 2012

Weave All Wobbles

Back in July, I wrote about some of the techniques I used to simulate old 16mm film entirely using free software.

One technique I mentioned was using Kdenlive to simulate gate weave – that strangely pleasing effect whereby the picture moves almost imperceptibly around the screen as you watch. If you’re not familiar with what gate weave looks like, here’s an example:


I mentioned in my previous article that I discovered I could simulate gate weave manually using the Kdenlive “Pan and Zoom” filter.

I did this by zooming in on my video slightly…

Video resized to 108% to zoom in on it slightly

…and then moving the picture around randomly at 5 frame intervals.

Keyframes added at five frame intervals; X and Y changed randomly

Once this was done, I could save this portion of gate weave as a custom effect so I could re-use it:

Save effect button in Kdenlive
 
When you do this, your zoom, keyframes and random movements are stored in the ~/.kde/share/apps/kdenlive/effects folder as an XML file. The XML file created by Kdenlive for some manually created gate weave looks like this:
<effect tag="affine" type="custom" id="test">
<name>test</name>
<description>Adjust size and position of clip</description>
<author>Charles Yates</author>
 <parameter opacity="false" 
    default="0%,0%:100%x100%" 
    type="geometry" 
    value="-29,-23:778x622:100;
      25=-30,-25:778x622:100;
      50=-29,-22:778x622:100;
      75=-31,-24:778x622:100;
      100=-28,-24:778x622:100;
      123=-31,-21:778x622:100" 
    name="transition.geometry">
  <name>Rectangle</name>
 </parameter>
</effect>
Obviously, creating gate weave by hand for a long piece of video using the Kdenlive interface would take hours. Luckily, because the resulting gate weave custom effect is stored as a simple XML file, you can write a quick script to create the gate weave instead.

So I thought I'd use this post to show you the script I use to create “automatic” gate weave.

When I do scripting jobs I prefer to use Python if possible. For this task I needed it to write out an XML file. Python comes with a selection of complex but hugely flexible ways to do this; however, to make things as quick and easy as possible, I used a lovely Python module called pyfo, which was developed by Luke Arno and did everything I needed.

Although you can install pyfo if you want to, there’s no need; you can just extract the pyfo.py file and put it in the same folder as your Python scripts that use it.

As you can see below, my Python script is pretty self-explanatory. It is suitable for adding gate weave to PAL 4:3 video.

The variable value_string determines how much the video is zoomed in by initially. For wide-screen PAL 16:9 video I adjust this value to "-29,-23:1034x622:100;". The value of the step_value variable determines how often the video frame moves. I think every 5 frames often works best.

You can run the script multiple times to create different xml files, but if you do remember that you will need to change the value of the custom_effect_name variable each time to something different so you’ll be able to tell your gate weave custom effects apart.

#!/usr/bin/env python2.4

""" This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program.  If not, see http://www.gnu.org/licenses/."""

import random
from pyfo import pyfo

custom_effect_name = "weave"

#Number of frames of gate weave required (750 frames is 30 seconds at 25fps)
frames_required = 750

#Initial geometry value (frame 0)
value_string = "-29,-23:778x622:100;"

#Origin
origin = {'x':-29, 'y':-23}

#Step value (in frames)
step_value = 5

#Maximum weave distances
max_distance = {'x': 1, 'y': 1}

for i in range(0, frames_required, step_value):
    x = random.randrange(origin['x'] - max_distance['x'],
            origin['x'] + max_distance['x'] + 1)
    y = random.randrange(origin['y'] - max_distance['y'], 
            origin['y'] + max_distance['y'] + 1)
    value_string += str(i) + "=" + str(x) + "," + str(y) + ":778x622:100;"

xml_output = \
    ('effect', [
    ('name', custom_effect_name),
    ('description', 'Adjust size and position of clip'),
    ('author', 'Charles Yates'),
    ('parameter', ('name', 'Rectangle'),
    {'opacity':'false', 'default':'0%,0%:100%x100%', 
    'type':'geometry', 'value':value_string, 'name':'transition.geometry'}),
    ], {'id':custom_effect_name, 'type':'custom', 'tag':'affine'})

result = pyfo(xml_output, pretty=True, prolog=False, encoding='ascii')
print result.encode('ascii', 'xmlcharrefreplace')

As it is, my Python script writes the XML it produces to the console window. You can copy and paste the resulting XML into a blank text file and save it to the ~/.kde/share/apps/kdenlive/effects folder.

However, you can pipe the output straight into an XML file instead. For instance:
$ python gateweave.py > gateweave.xml
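
Or, to skip the copy-and-paste step entirely, send it straight into the Kdenlive effects folder mentioned above:

$ python gateweave.py > ~/.kde/share/apps/kdenlive/effects/gateweave.xml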
Obviously, my Gate Weave solution isn’t very elegant, but who cares – it works, and it’s all free software!

Saturday, January 14, 2012

ATV Yesterday and Today

If you’ve read my blog before, you may have come across some posts about my friend Roddy Buxton. Roddy is an incredibly inventive chap – he’s rather like Wallace and Gromit rolled into one! He has his own blog these days and I find everything on it fascinating.

One of Roddy’s cracking contraptions

One of the subjects recently covered on Roddy’s blog is the home-made telecine machine he built. The telecine is a device for transferring pictures from film to television, invented by John Logie Baird at the very dawn of broadcasting (he began work on telecine back in the 1920s).

Roddy also shares my love of everything ATV, so naturally one of the first films Roddy used to demonstrate his telecine was a 16mm film copy of the ATV Today title sequence from 1976.

This title sequence was used from 1976 to 1979 and proved so iconic (no doubt helped immeasurably by the rather forgetful young lady who forgot to put her dress on) that it is often used to herald items about ATV on ITV Central News. Sadly, as you can see below, the sequence was not created in widescreen, so it usually looks pretty odd when it’s shown these days.

How the sequence looks when broadcast these days.

The quality of Roddy’s transfer was so good I thought it really lent itself to creating a genuine widescreen version. In addition, this would provide me with a perfect opportunity to learn some more about animating using the free software animation tool Synfig Studio.

The first thing to do when attempting an animation like this is to watch the source video frame by frame and jot down a list of key-frames – the frames where something starts or stops happening. I use a piece of free software called Avidemux to play video frame by frame. Avidemux is like a Swiss Army knife for video and I find it handy for all sorts of things.

Video in Avidemux

I write key-frame lists in a text file that I keep with all the other files for a project. I used to jot the key frames down on a pad, but I’ve found using a text file has two important advantages: it’s neater and I can always find it! Here is my key-frame list in Gedit, which is my favourite text editor:

Key-frame list in Gedit

After I have my key-frame list I then do any experimenting I need to do if there are any parts of the sequence I’m not sure how to achieve. It’s always good to do this before you start a lot of work on graphics or animation so that you don’t waste a lot of time creating things you can’t eventually use.

The ATV Today title sequence is mostly straightforward, as it uses techniques I’ve already used in the Spotlight South-West titles I created last year. However one thing I was not too sure about was how to key video onto the finished sequence.

Usually, when I have to create video keyed onto animation I cheat. Instead of keying, I make “cut-outs” (transparent areas) in my animation. I then export my animation as a PNG32 image sequence and play any video I need underneath it. This gives a perfect, fringeless key and was the technique I used for my News At One title sequence.

However, with this title sequence things were a bit trickier – I needed two key colours, as the titles often contained two completely different video sequences keyed onto it at the same time.

Two sequences keyed at once

Therefore I had to use chromakeying in Kdenlive using the “Blue Screen” filter, something I had never had a lot of success with before.

The first problem was simple to get around – I couldn’t key two different video sequences onto two different key colours at once in Kdenlive, so I keyed the first colour, exported the video losslessly (so I would get no compression artefacts), then keyed the second colour.

The harder part was making the key look smooth. Digital keying is an all or nothing affair, so what you key tends to have horrible pixellated edges.

Very nasty pixel stepping on the keyed video

The solution to this problem was obvious, so it took me quite a while to hit upon it! The ATV Today title sequence is standard definition PAL widescreen. However, if I exported my animation at 1080p HD and did the keys in HD, they would have much nicer rounded edges as the pixels are “smaller”. I could then downscale the video to standard definition once the keying was done and get the rounded effect I was after.

Smooth keying, without pixel stepping
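
The final downscale can be done in Kdenlive’s render dialog, or with a one-liner along these lines (a sketch only: the file names are mine, 1024x576 assumes square-pixel PAL widescreen, and older ffmpeg builds may want -s 1024x576 instead of -vf):

ffmpeg -i keyed_1080p.avi -vf scale=1024:576 -vcodec huffyuv keyed_sd.avi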

The other thing I found is that keying in Kdenlive is very, very sensitive. I had to do lots of test renders on short sections as there was only one “Variance” setting (on a scale between 1 and 100) that was exactly right for each colour.

Now that I was convinced I could actually produce the sequence, it was time to start drawing. I created all of the images for the sequence in Inkscape, which is a free software vector graphics tool based around the SVG standard.

However, in order to produce images in Inkscape I needed to take source images from the original video to trace over. I used Avidemux to do this. The slit masks that the film sequences are keyed on to are about four screens wide, so once I had exported all the images I was interested in I needed to stitch them together in the free software image editor The GIMP. Here is an example, picked totally at random:

She'll catch her death of cold…

Back in Inkscape, I realised that the sequence was based around twenty stripes, so the first thing I did before creating all the slit mask images was to create guides for each stripe:

These guides saved me a lot of time

The stripes were simply rounded rectangles that I drew in Inkscape. It didn't take long to trace all of the slit masks for the title sequence. Two of the masks were repeated, which meant that I didn’t have as many graphics to create as I was fearing.

Once the slit masks were out of the way I could create the smaller items such as the logo:

ATV Today logo created in Inkscape

And, with that, all the Inkscape drawing was done. It was time to animate my drawings now, so I needed to export my Inkscape drawings into Synfig Studio. To do this I was able to use nikitakit’s fantastic new Synfig Studio SIF file Exporter plug-in for Inkscape. This does a fabulous job of enabling Inkscape artwork to be used in Synfig Studio, and it will soon be included as standard in Inkscape releases.

When I did my Spotlight title sequence I exported (saved) all of my encapsulated canvases (akin to Symbols in Flash) that I needed to reuse within my main Synfig file. This was probably because I came to Synfig from Macromedia Flash and was used to the idea of having a large file containing all the library symbols it used internally.

I have been playing with Synfig Studio a lot more since then, and I realised a far more sensible way to work was to have each of what would have been my library symbols in Flash saved as separate Synfig files. Therefore I created eight separate Synfig Studio files for each part of the sequence and created a master file that imports them all and is used to render out the finished sequence.

The project structure

This meant that my finished sequence was made up of nine very simple Synfig animation files instead of one large and complicated one.

The animation itself mainly consisted of simply animating my Inkscape slit masks across the stage using linear interpolation (i.e. a regular speed of movement).

I could type my key-frames from my key-frame text file directly into the Synfig Studio key-frame list:

Key-frames for one part of the animation

The glow was added to the ATV Today logo using a “Fast Gaussian Blur”, and the colour was changed using the “Colour Correct” layer effect – exactly the same techniques I used in the Spotlight South-West titles.

ATV Today logo in Synfig

In order to improve the rendering speed, I made sure I changed the “Amount” (visibility) of anything that was not on the stage at the present time to 0, so the renderer wouldn't bother trying to render it. You do this using Constant interpolation, so that the value is either 0 or 1.

I had a couple of very minor problems with Synfig while I was working on this animation. One thing that confused me sometimes was the misalignment of the key-frame symbols between the Properties panel and the Timeline.

This misalignment can be very confusing

As you can see above, the misalignment gets greater the further down the “Properties Panel” something appears. This makes it quite hard at times to work out what is being animated.

Some very odd Length values indeed!

Another problem I had was that the key-frame panel shows strange values in the time and length columns – particularly if you forget to set your project to 25 frames per second at the outset.

However, overall I think Synfig Studio did brilliantly, and I would choose it over Flash if I had to create this sequence again and could choose any program to create it in.

The most important technical benefit of Synfig Studio for this job was the fact that it uses floating point precision for colour, so the glows on the ATV Today logo look far better than they would have done in Flash as the colour values would not be prematurely rounded before the final render.

I rendered out my Synfig Studio animation as video via ffmpeg using the HuffYUV lossless codec, and then I was ready to move on to Kdenlive and do the keying.
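
In case it helps, that ffmpeg step looks roughly like this – assuming, as I do here, that Synfig has rendered the animation as a numbered PNG sequence (the file names are my own):

ffmpeg -r 25 -i frame.%04d.png -vcodec huffyuv titles.avi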

Obviously I needed some “film sequences” to key into the titles, but I only have a small selection of videos as I don't have a video camera. To capture video I use my Canon Ixus 65, which records MJPEG video at 640 x 480 resolution at 30fps.

My 16mm film camera

Bizarrely, when the progressive nature of its output is coupled with the fact that it produces quite noisy pictures, it makes a perfect digital substitute for a 16mm film camera!

I “filmised” all the keyed inserts, so that when they appear in the sequence they will have been filmised twice. Hopefully, this means I’ll get something like the degradation in quality you get when a film is then transferred to another film using an optical printer.

Once the keying was done, the finished sequence was filmised entirely in Kdenlive, using techniques I've already discussed here.

And so, here’s the finished sequence:


Although I’m not happy about the selection of clips I’ve used, I’m delighted with the actual animation itself. I’m also very pleased that I’ve completed another project entirely using free software. However, I think the final word should go to Roddy:

Thanks for the link. I had a bit of a lump in my throat, seeing those titles scrolling across, hearing the music, while munching on my Chicken and Chips Tea… blimey, I was expecting Crossroads to come on just after!
If you are interested in ATV, then why not buy yourself a copy of the documentary From ATV Land in Colour? Three years in the making and over four hours in duration, it contains extensive footage (some not seen for nearly fifty years) and over eleven hours of specially shot interviews edited into two DVDs.

Sunday, August 07, 2011

Sunday’s Newcomers

Click to enlarge

Going through my old Flash files, I stumbled across an early version of this image, which I first produced in 2005. I didn’t know how to make it look realistic then, but I’ve since been given lots of good advice by Rory Clark. This new version was produced in Inkscape and aged in The GIMP.

In case you’re wondering, these were all real IBA Transmitters.

Wednesday, August 03, 2011

Doing my pennants…

I often spend idle half-hours looking around flickr for anything of interest. The other day I found a very nice Anglia logo from 1959. Obviously, I couldn’t resist recreating it in Inkscape while I was listening to a podcast:

Click to enlarge

This stylised Anglia pennant logo formed the basis of Anglia Television’s original end-caps, including the one seen on their opening program.