This video shows how to make a Virtual Instrument for a simple instrument - a kalimba (otherwise known under many names, such as mbira). Typically these seem to have about 17 notes. I wonder if I should buy one - they're not very expensive.
https://youtu.be/B2iEhJcrHxI Making the VI
The video explanation follows a few basic principles:
1. Record the samples - here five at a time - and for some reason two microphones were used, so I think that's actually ten samples, or were the two mics used for stereo? Also record the "silence" to capture the noise.
2. Remove the noise from the recorded tracks. Here the tool used was RX Elements; I suspect the noise reduction in Audacity would do almost as good a job, or possibly even better.
3. Import the tracks into Reaper, and then find the transient starts for each sample. This is a technique which can be used in other DAWs too - some of which also do this automatically.
4. Then split the tracks at the transient starts, and back off the start of each region slightly - by exactly the same small amount for every region, as mentioned in the video - so that each separate region includes a very short period of time before the transient (see the sketch after this list).
5. The next phase is to import all the clips into a sampler - which I assume here is Kontakt. The clips are all given systematic names, ready for auto-mapping, which associates each clip with a specific note on the keyboard. For this particular instrument, the clips were only recorded at a single dynamic level, on the basis that the instrument's notes don't change much when played at different dynamics - which is not the case for some other instruments.
6. Each clip name is associated with a pitch, and an automap function is then used to map the clips onto the keyboard (a sketch of that naming scheme also follows the list).
7. There were then some additional tweaks, such as refining the tuning for each sample, and also the volume levels.
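As a rough illustration of steps 3 and 4, here is a minimal Python sketch of transient splitting with a fixed pre-roll, assuming the denoised recording has already been loaded as a mono numpy array. The threshold, pre-roll and minimum-gap values are arbitrary illustrations, not taken from the video; in practice a DAW does this interactively rather than in code.

```python
import numpy as np

def split_at_transients(audio, sr, threshold=0.1, pre_roll_ms=10, min_gap_s=0.5):
    """Rough transient-based splitter: find points where the signal first
    exceeds a level threshold, back each split point off by a fixed pre-roll,
    and return one clip per detected hit. All values are illustrative only."""
    pre_roll = int(sr * pre_roll_ms / 1000)     # same small back-off for every region
    min_gap = int(sr * min_gap_s)               # ignore re-triggers while a note is still ringing
    env = np.abs(audio)

    starts = []
    i = 0
    while i < len(env):
        if env[i] >= threshold:
            starts.append(max(i - pre_roll, 0))  # include a short run-up before the transient
            i += min_gap
        else:
            i += 1

    # Each region runs from its (backed-off) start to the next start, or to the end.
    bounds = starts + [len(audio)]
    return [audio[bounds[n]:bounds[n + 1]] for n in range(len(starts))]
```

The key point is simply the fixed pre-roll applied identically to every region, which is what the video emphasises.

And a sketch of the systematic naming idea from steps 5 and 6, assuming a hypothetical 17-note C-major layout starting at C4 (the actual tuning and filename pattern in the video may differ). Embedding the pitch in each filename is what lets the sampler's auto-map place the clips on the right keys.

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_to_name(midi):
    """60 -> 'C4' (octave numbering with C4 = 60)."""
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

# Hypothetical 17-note C-major layout starting at C4; the real kalimba's tuning may differ.
major_steps = [0, 2, 4, 5, 7, 9, 11]
kalimba_midis = [60 + 12 * (i // 7) + major_steps[i % 7] for i in range(17)]

for midi in kalimba_midis:
    # The pitch embedded in the filename is what the sampler's auto-map keys on.
    print(f"kalimba_{midi_to_name(midi)}_{midi}.wav")
```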
The overall process can be followed in other DAWs, using other sampling synthesisers such as Alchemy or EXS24 in Logic.
More complex instruments may respond differently depending on dynamics or articulation. Those can be modelled by associating different samples with MIDI velocity ranges, together with the automation features in a DAW. That would give a slightly (possibly significantly) different sound for each note, depending on how hard each key on the keyboard is struck - the sketch below illustrates the idea.
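A minimal sketch of that velocity-layer idea, assuming three hypothetical dynamic layers per note; the velocity break-points are arbitrary, and in practice a sampler exposes this as velocity-range settings on each zone rather than code.

```python
# Hypothetical velocity-layer lookup: choose which recorded dynamic layer
# to play back for a given MIDI note-on velocity (0-127).
VELOCITY_LAYERS = [
    (0, 47, "soft"),      # quiet strikes
    (48, 95, "medium"),   # normal strikes
    (96, 127, "hard"),    # accented strikes
]

def sample_for(note_name, velocity):
    for lo, hi, layer in VELOCITY_LAYERS:
        if lo <= velocity <= hi:
            return f"kalimba_{note_name}_{layer}.wav"
    raise ValueError("velocity out of MIDI range")

print(sample_for("C4", 30))    # -> kalimba_C4_soft.wav
print(sample_for("C4", 110))   # -> kalimba_C4_hard.wav
```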
Another refinement is to shape the decay at the end of each sample - particularly appropriate for the kalimba, since it isn't an instrument which can produce a sustained sound (a sketch follows below). Instruments which can produce a sustained note (e.g. violins), and which also have other articulations - staccato, spiccato and plucked - are likely to require a more complex approach: either recording separate samples for each articulation, or simulating the articulation effects by altering the attack and decay of each sample in the sampler.
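A minimal sketch of decay shaping, assuming the clip is a numpy array; the decay start and decay time are arbitrary values, and a real sampler would do this with its amplitude envelope rather than offline processing.

```python
import numpy as np

def shape_decay(clip, sr, decay_start_s=0.5, decay_time_s=0.8):
    """Leave the attack untouched and multiply the tail by an exponential
    decay, roughly what a sampler's amplitude envelope does to tame a
    long-ringing sample. Times are illustrative, not taken from the video."""
    out = clip.astype(float).copy()
    start = int(decay_start_s * sr)
    if start >= len(out):
        return out                      # clip shorter than the decay start: nothing to shape
    t = np.arange(len(out) - start) / sr
    out[start:] *= np.exp(-t / decay_time_s)
    return out
```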
Of course the VI doesn't have to be realistic. The first implementations of the VI in the video had an extended range, going well beyond the range of the actual kalimba. If the sounds are interesting enough, then why not keep them?