Friday, May 29, 2009

Technical Inspiration:

Here are some tutorials that I am particularly fond of, and that have inspired (and will continue to inspire) the techniques I develop this summer for my live performance set.  Most of them feature clever software tricks, custom software implementations, and software features that are often overlooked but carry a lot of potential.

These videos are HIGHLY RECOMMENDED for anyone using Ableton Live for their live music sets, as well as anyone interested in doing live visuals.  So check them out!!!

Ableton Tutorials:


This video introduces the basic concepts of Dummy Clips in Ableton Live utilizing the Chain Selector.  Dummy Clips allow you to create extensive preprogrammed (and sequenced/automated) effects that can be launched as clips!  (By the way, if you haven't figured out how to use chains within Live's Racks, you really should!  The Covert Operators have a great tutorial and explanation on the topic here!)


Part 2 goes more in-depth, showing how you can use automation envelopes in your Dummy Clips to create dynamic filter sequences.  Pay special attention to the second half of the video, where they demonstrate an automation recording trick (switching between the Arrangement and Clip views) that has many uses beyond creating Dummy Clips.

I actually use this automation recording technique when working with external synths and drum machines in order to instantly record into Live all the MIDI CC parameters that I might want to automate.  I find it especially useful when recording sequences that I've preprogrammed on my Elektron Machinedrum, since (in addition to capturing the MIDI note sequences) it lets me instantly record the MIDI CC automation created with the machine's special Parameter Lock/Slide functions.
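If you want to see exactly which CC numbers an external box is sending before you record and map them in Live, a quick monitor script can help.  This is just a minimal sketch, assuming the Python mido library (with a MIDI backend such as python-rtmidi) is installed; the port name is hypothetical, so list your ports first and substitute your own.

```python
# Minimal MIDI CC monitor -- an illustrative sketch, not part of Live itself.
# Assumes the `mido` library plus a backend (e.g. python-rtmidi), and that the
# external synth/drum machine's MIDI output reaches this computer.
import mido

print(mido.get_input_names())  # list the available input ports, then pick yours below

last_seen = {}
with mido.open_input("Elektron MachineDrum") as port:  # hypothetical port name
    for msg in port:
        if msg.type == "control_change":
            last_seen[(msg.channel, msg.control)] = msg.value
            print(f"channel {msg.channel + 1}  CC {msg.control} = {msg.value}")
```

Once you know which CCs the hardware sends, you know exactly which parameters to arm and capture with the Arrangement/Clip view recording trick above.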


Though this tutorial is a bit long, it really gives a better idea of the many cool things you can do with Dummy Clips.  It also delves into using Follow Actions in combination with Dummy Clips, which looks quite useful for making a Dummy Clip automatically reset after it has played through, as well as for jumping randomly through the Dummy Clips to create generative effect sequences.

Tom Cosm also has a lot of other interesting tutorials worth checking out.  If you are new to performing with Ableton Live, his tutorial Moving from the studio to live performance using Ableton Live covers some good fundamental techniques for building your P.A.  Or, if you want to drool over Akai's new APC40, check out Akai APC40 - Opening The Box.


In addition to using dummy tracks to control effects and automation within Live, I plan to port these automations out to Max/MSP/Jitter or Modul8 to control effects on my video as well.  For this, I am adapting strategies and custom software presented in the following tutorial.

A/V Sync Tutorial:


Please excuse the poor production quality of this video.  I recommend going to his tutorial page here in order to fully understand what's going on in the video, as well as to download the necessary tools to make this work.  Those of you fairly experienced with Max/MSP/Jitter and OSC should be able to quickly figure out how to port this over and receive the OSC messages in Max.

For everyone else (most of you?!), let this video serve as a preview of my own solution to this problem, for use with Modul8 or Max 5, which I will be releasing soon for free along with a tutorial on how to use it.  I have built a modified and expanded version of digital funfair's Pluggo VST that will port multiple OSC parameters (either 8 or 16 per plugin instance!) from Live to Modul8 through a configurable Max 5 router (or, possibly/alternatively, OSCulator).  It will allow you to send and scale Clip Automation Envelopes in high resolution from Ableton to Modul8 using completely free software (Cycling 74's Pluggo Jr. and Max 5 Runtime).

So essentially you will be able to use Ableton's Clip Automation Envelopes to automate effects in your video, the same as you can with audio, all perfectly synced and at extremely high resolution (16,383 steps as opposed to MIDI's 127 steps).  Additionally, you could port MIDI or live controller information this way as well... (which might seem pointless, except that it opens up the possibility of using a wireless network to sync music and visuals between multiple computers, among other cool things!)
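To make the routing idea concrete, here is a rough sketch of what the middle "router" step does, written in Python purely for illustration (the real version will be a Max 5 patch).  Everything in it is an assumption: the python-osc library, the port numbers, and the OSC addresses (/live/param/* coming from the plugin, /md8/... going to Modul8) are all made up.  The job, though, is the same: receive a normalized envelope value from Live, rescale it to the range the video parameter expects, and forward it on.

```python
# A stand-in sketch of the OSC "router" step -- the actual router is a Max 5 patch.
# Assumes the python-osc library; all addresses and port numbers are hypothetical.
from pythonosc import dispatcher, osc_server, udp_client

# Where Modul8 (or any OSC-capable video app) is listening -- hypothetical port.
modul8 = udp_client.SimpleUDPClient("127.0.0.1", 8000)

# Map incoming Live parameters to outgoing addresses and target value ranges.
ROUTES = {
    "/live/param/1": ("/md8/layer/1/alpha", 0.0, 1.0),
    "/live/param/2": ("/md8/layer/1/scale", 0.5, 2.0),
}

def relay(address, value):
    """Rescale a normalized (0..1) envelope value and forward it to the video app."""
    if address not in ROUTES:
        return
    out_address, lo, hi = ROUTES[address]
    modul8.send_message(out_address, lo + (hi - lo) * float(value))

disp = dispatcher.Dispatcher()
for addr in ROUTES:
    disp.map(addr, relay)

# Listen for OSC coming out of the plugin running inside Live -- hypothetical port.
server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), disp)
server.serve_forever()
```

The scaling step is where the high resolution pays off: because the value arrives as a float rather than a 0-127 MIDI byte, the rescaled output moves smoothly instead of stepping.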

Lemur Control Tutorial:


I was hoping to have the new APC40 today, but sadly it seems Akai promised A LOT MORE units than they actually produced in their first run, so, like many others, I will probably have to wait until later this summer to get my hands on one.  In the meantime, I think I will make use of this awesome new Lemur patch/interface made by The Covert Operators to control clips in Live.


I will post more tutorials in the future, including some produced by me.  Please feel free to comment and ask questions about anything I've posted.  Thanks!
-BM

Tuesday, May 19, 2009

"Put It On Ya" Music Video (2008)



Only a week after I finished this video back in May 2008, my hard drive was stolen/lost at a gig in Toronto.  To my great happiness, I recently found a lost DV tape that had the final edit of the video on it, and I just encoded and uploaded it to my Vimeo page.

This video contains a lot of my original ideas about synchresis (the synthesis of synchronicity between sound and vision, a term coined by Michel Chion) and includes some of the footage that I used for my live shows, especially the big Mutek show.  I'd say that in a lot of ways my aesthetic has changed and expanded, but the use of layering and keyed/matted images is still one of my core concerns.

In digital video, you can create a video file that contains an alpha channel (RGB+A), which automatically sets transparency in your video and allows your desired subject(s) to float on top of other video layers.  Unfortunately, only a few codecs support this (Animation+, PNG, TGA, Sheer+), and they come at the cost of much larger files than normal RGB video, which can put a serious strain on the CPU.  I will be exploring these formats and their use during this project and will provide some perspective on which ones seem to work best.
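As a quick illustration of what that alpha channel actually does when layers get composited, here is a minimal per-pixel sketch in Python/NumPy (my own stand-in for demonstration only; Modul8, VDMX, and Jitter do the equivalent on the GPU).  It shows the standard "over" blend of a straight-alpha RGBA frame on top of an RGB background.

```python
# Minimal straight-alpha "over" compositing -- an illustrative NumPy sketch,
# not what any of these VJ applications literally run (they do this on the GPU).
import numpy as np

def alpha_over(fg_rgba: np.ndarray, bg_rgb: np.ndarray) -> np.ndarray:
    """Composite an 8-bit RGBA foreground frame over an 8-bit RGB background."""
    fg = fg_rgba[..., :3].astype(np.float32) / 255.0   # foreground colour
    a  = fg_rgba[..., 3:4].astype(np.float32) / 255.0  # alpha: 0 = clear, 1 = opaque
    bg = bg_rgb.astype(np.float32) / 255.0
    out = fg * a + bg * (1.0 - a)                       # the "over" blend
    return np.clip(out * 255.0 + 0.5, 0, 255).astype(np.uint8)

# Example: a half-transparent red frame floating over a grey background.
fg = np.zeros((480, 640, 4), dtype=np.uint8)
fg[..., 0] = 255   # red
fg[..., 3] = 128   # roughly 50% alpha
bg = np.full((480, 640, 3), 64, dtype=np.uint8)
composite = alpha_over(fg, bg)
```

That fourth (A) channel is exactly what the codecs listed above preserve and what ordinary RGB codecs throw away, which is also why the files get so much bigger.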

A somewhat more in-depth reading on the subject can be found here: Alphas Made Transparent

Building Blocks

Megasoid x Blingmod @ Mutek 2008, photo: fairlyawesome


To start off, I figure I should let you all know where I'm at, where I hope to go, and what I'm working with for my live A/V/P.A. set.


I've been performing fairly regularly for the last two years, focused more on the musical side than the visual side.  Until last weekend (my test run at Super Aqua Club), I had never performed visuals and music together, nor run both off the same laptop.


For my music set, I've always used Ableton Live.  My set has essentially consisted of triggering audio loops bounced from stems of my songs, on top of which I've experimented with a variety of live chopping/filtering/processing techniques.  This summer I plan to explore new processing techniques, start running live (software-based) synthesizers and drum machines, and experiment with live MIDI filtering and arpeggiation.


For my live video, I built a fairly extensive patch in Max/MSP/Jitter from the ground up in 2007/2008.  I performed with this alongside my friends at our Turbo Crunk parties and was even honored to play the Mutek Festival last year with Megasoid, opening for Modeselektor.  In the final version of my patch, I was taking live MIDI and control feeds from Megasoid's Ableton set, as well as from a chopper/filtering patch that they had written and were running live off a Nord G2 Modular Synthesizer.  All of this I controlled from a patch that I built on my Lemur.


I hope to build upon ideas that came about during these performances, utilizing more cross-modal connections between my musical and visual software in ways that effectively balance interest between both mediums.  For now, I am stepping away from using or developing a standalone visual environment in Max/MSP/Jitter and have been exploring the possibilities of (mostly) developed software such as Garagecube's Modul8 and Vidvox's VDMX5.  However, I still use Max as a routing/scaling/processing device for controller information between applications and my Lemur.


To give you an idea of what I'm working with, here is the performance hardware and software I am using and plan to use this summer as I develop this project...


Hardware:

-Apple Macbook Pro

-Presonus Firebox

-Jazzmutant Lemur

-Faderfox LV2 (soon to be replaced by...)

-Akai APC 40 (next week!)

-G-Tech Mini FW800 Hard Drive


Software:

-Ableton Live 8

-Garagecube Modul8

-Vidvox VDMX5

-Cycling 74 Max/MSP/Jitter




Monday, May 18, 2009

Welcome to Blingmod A/V.

I've created this blog to document my progress as I work on building a new live performance set this summer for my Bearmod/Blingmod projects. In this undertaking, I am attempting to create a platform to combine live beat/song generation (using Ableton Live) along with live visual/video generation (using tools to be determined).

This journey is already well under way, and the most useful resources I have found come from others who have attempted similar A/V projects and have graciously posted tips, tricks, tutorials, patches, templates, etc. on the internet for free. With this blog, I hope to join this wonderful open-source community by sharing the discoveries that I have been making and will continue to make along the way.

Ultimately, I hope this blog will serve to inspire and help others in creating and/or improving their own A/V performances.

Please feel free to comment on posts with questions, criticisms, recommendations, etc... All feedback is welcome!

Thanks!

-BM