
Signal and Data Types


Patch leads and Plugs are colour coded depending on the signal type.
Plugs (AKA Pins) are the coloured ‘spots’ on the modules, and Patch leads are the cables we “draw” to connect the modules.

The text and coloured spot on each module represent a plug that can be connected to other modules using patch cords.
A plug allows a module's parameters to be changed by the output of another module (such as a control), information to be transferred, and control voltages to be sent and received.
Input plugs are normally located on the left of a module, and output plugs on the right.
There are a few exceptions to this rule among the GUI modules.
Plugs are colour coded depending on their signal type, as are the patch cords.

GUI or Graphical User Interface.

This is the part of SynthEdit that contains the interface/control panel itself and some of the controls for the DSP modules; the GUI handles all the key clicks, mouse events and visual display.
So that you can see what type a module is, DSP modules are grey and GUI modules are blue.
In general you cannot connect GUI to DSP directly, but there are some modules that allow information to pass between GUI and DSP modules. These GUI/DSP converters are drawn part light blue, part light grey.

DSP or Digital Signal Processing.

This covers Oscillators, Filters, Modulators, Inverters, Level controls, Voltage Controlled Amplifiers, and controls such as knobs, lists, sliders, lamps and switches. These are the modules that generate, control and process your sounds. Most controls send voltages to the modules, but some, such as a list entry, let you select one value from a pre-defined list in a module. There are also some with values you can pre-set, such as Boolean (0 or 1) or Voltage values.

DSP versus GUI Data.

DSP data cannot be directly connected to a GUI module because of the different ways they handle data; SynthEdit will not allow the direct connection. The data rates are higher for DSP, reflecting the amount of data which must be processed, while GUI data rates are lower so that more CPU priority can be given to DSP signal handling.
You must use a special data conversion module to communicate between the two types of data, and never use it to process DSP data; the conversion should only be used to send signals from a GUI control to a DSP module to change its operating parameters. Never try to use GUI to DSP for modulation or other rapid changes.

Timing.

In SynthEdit, communication between DSP and GUI is only meant to be accurate enough to handle controls and visual display items. There is no guarantee of precise timing.
This means that for fast timing and data updates there is a risk that data will be skipped or mis-timed. For this reason (and others) you should never convert DSP data to GUI in order to use GUI modules to process DSP signals. Not ever.
DSP data runs at the sample rate of 44.1 kHz or more, while GUI communication takes place at roughly 30 to 60 Hz! In other words, hundreds of audio samples pass between one GUI update and the next.
In short, do not try to use GUI modules to handle DSP data. It will fail.

Bit Depth

SynthEdit processes all signals internally at 32 bits floating point.
When SynthEdit writes to files, it outputs either 16-bit integer or 32-bit floating point samples.
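
As a rough illustration of what that reduction involves, here is a minimal sketch (not SynthEdit's actual export code; its rounding and any dithering choices may differ) of converting one 32-bit float sample to a 16-bit integer sample:

    #include <algorithm>
    #include <cstdint>

    // Hypothetical illustration: reduce one 32-bit float sample (nominal range -1.0 to +1.0)
    // to the kind of 16-bit integer sample written to a 16-bit file.
    std::int16_t floatTo16Bit(float sample)
    {
        float clamped = std::clamp(sample, -1.0f, 1.0f); // guard against values outside range
        return static_cast<std::int16_t>(clamped * 32767.0f);
    }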

Sample Rate

Many software synths separate control signals and audio signals. Control signals, e.g. LFOs and envelopes, are processed at a low sample rate; this saves CPU power, but results in sluggish envelopes and zipper noise (noticeable stepping clicks on fast envelopes, panning or volume fades). You also have the hassle of needing conversion modules.
To maintain sound quality, SynthEdit uses the same high sample rate for all audio signals. As a result, SynthEdit simply sounds cleaner and more responsive than many other soft synths. SynthEdit supports any sample rate; the oscillator waveforms are generated at run time to suit the sample rate.

The types of data signals.

Blue:- Normal Audio or Control Voltage signal. DSP use only.
Audio or control voltage signals. Voltage is essentially floating point data. It is used to send audio from one DSP module to another, or to send control voltages from one DSP module to another to control the recipient's behaviour.
DSP Floating Point and Volts pins will inter-connect. Connecting a Float output plug to a Volts input plug should cause no problems, because smoothing of the Float value takes place automatically. However, it's not good practice to connect a Volts output plug to a Float input plug: many of the Float inputs on a DSP module are not intended for fast-changing modulation values, and making this type of connection can cause crackling and audio dropouts when modulating the value on the plug.
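
To illustrate why that smoothing matters, here is a minimal, hypothetical one-pole smoother of the kind used to ramp a coarsely updated value at audio rate. It is an illustration only, not SynthEdit's actual internal code:

    #include <cmath>

    // Hypothetical one-pole parameter smoother: ramps a coarsely updated Float value
    // towards its target once per audio sample, avoiding zipper noise.
    class ParamSmoother
    {
    public:
        void setSampleRate(double sampleRate, double smoothingMs = 20.0)
        {
            // Coefficient for an exponential glide of roughly 'smoothingMs' milliseconds.
            coeff = std::exp(-1.0 / (0.001 * smoothingMs * sampleRate));
        }

        void setTarget(double newTarget) { target = newTarget; }

        // Call once per sample; returns the smoothed value.
        double next()
        {
            current = target + coeff * (current - target);
            return current;
        }

    private:
        double coeff = 0.0, target = 0.0, current = 0.0;
    };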


Note: Voltage plugs are always DSP Plugs.

—————————————————————————————————-

Light Blue:- Floating point values
A floating point number is a positive or negative number with a decimal point. For example, 5.5, 0.25, and -103.342 are all floating point numbers, while 91 and 0 are whole numbers (integers). Floating point numbers get their name from the way the decimal point can “float” to any position necessary within the number.
Note: with large numbers, there are times when the results of floating point calculations are not 100% accurate.
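
A short, standard C++ illustration of that loss of accuracy (nothing SynthEdit specific):

    #include <cstdio>

    int main()
    {
        float big = 16777216.0f;      // 2^24, the largest run of exact integers in a 32-bit float
        float result = big + 1.0f;    // the 1.0 is lost: result is still 16777216
        std::printf("%.1f\n", result);

        float sum = 0.1f + 0.2f;      // prints 0.30000001, not exactly 0.3
        std::printf("%.8f\n", sum);
        return 0;
    }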

—————————————————————————————————-

Red:- Text data.
The Text data type stores any kind of text data. It can contain both single-byte and multibyte characters, depending on what the locale supports.
Text can be GUI or DSP.

—————————————————————————————————-

BLOB:- Binary Large OBject.
This is a more technical aspect of data in SynthEdit, and is not often used except for passing large amounts of binary data into or between modules when using or creating samplers or sample players.
In SynthEdit it has a built-in limit of 5 MB; trying to pass more than 5 MB of data won't work, as that amount of data can't be handled and will simply be dropped (ignored) and not transmitted.
A “BLOB” is a common acronym for “Binary Large OBject”, meaning a data object holding a large amount of binary data. Some languages have native BLOB types, but C++ doesn't. Nevertheless, creating a blob is simple enough – you just create an array of bytes. This is often done by creating an array of characters, which can be confusing, as an array of characters has a special meaning in C++ – it's also a string.
For more information you really need to read in-depth C++ programming tutorials and documentation.
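
You don't need to go that far to see the idea, though. Here is a minimal C++ sketch of a blob as a growable array of raw bytes, using std::vector<std::uint8_t> rather than a character array to sidestep the string confusion mentioned above:

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // A "blob" is just a block of raw bytes; here, four bytes of arbitrary binary data.
        std::vector<std::uint8_t> blob = { 0x52, 0x49, 0x46, 0x46 };

        // Grow it as needed, e.g. appending a 16-bit sample split into two bytes.
        std::uint16_t sample = 0x1234;
        blob.push_back(static_cast<std::uint8_t>(sample & 0xFF));        // low byte
        blob.push_back(static_cast<std::uint8_t>((sample >> 8) & 0xFF)); // high byte

        std::printf("blob holds %zu bytes\n", blob.size()); // 6 bytes
        return 0;
    }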

—————————————————————————————————-

Orange:- Integer/ Integer64.
The Integer data type stores whole numbers in the 32-bit signed range, -2,147,483,648 to 2,147,483,647, giving 9 to 10 digits of precision.
Note:- the value +2,147,483,648 is outside this range and cannot be used. An Integer value is stored as a signed binary integer and is typically used to store counts, quantities, and so on. Integer64 is the 64-bit equivalent, with a far larger range.
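
Those limits come straight from the underlying 32-bit (and 64-bit) signed integer types, which C++ can report directly:

    #include <cstdint>
    #include <cstdio>
    #include <limits>

    int main()
    {
        // 32-bit signed integer range: -2,147,483,648 to 2,147,483,647.
        std::printf("Integer min:   %lld\n",
            static_cast<long long>(std::numeric_limits<std::int32_t>::min()));
        std::printf("Integer max:   %lld\n",
            static_cast<long long>(std::numeric_limits<std::int32_t>::max()));

        // The 64-bit version extends the range to roughly +/- 9.2 x 10^18.
        std::printf("Integer64 max: %lld\n",
            static_cast<long long>(std::numeric_limits<std::int64_t>::max()));
        return 0;
    }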

—————————————————————————————————-

Yellow:- MIDI data.
MIDI is an acronym that stands for Musical Instrument Digital Interface. It’s a way to connect devices that make and control sound — such as synthesizers, samplers, and computers — so that they can communicate with each other, using MIDI messages.
SynthEdit takes the MIDI input from your chosen device and converts it into control signals (voltages) that its modules can understand, to control the modules' actions.

—————————————————————————————————-

Green:- A list of values. For example, Waveform names. DSP Only. Usually connects to a drop-down list.

—————————————————————————————————-

Black:- Boolean (logic on/off). This (for those of you familiar with electronics) is like the system used by TTL and CMOS logic chips.

—————————————————————————————————-


NOTE: SynthEdit will not allow you to connect patch cords to plugs of a different signal type without using a converter, except for Voltage and Float (but even there you should always use a data type converter).

MIDI 1 and MIDI 2 in SynthEdit.

MIDI 1 (Musical Instrument Digital Interface)
MIDI is a technical standard that describes the communications protocol, digital interface, and electrical connectors that connect a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing, and recording music.

MIDI Channels

This is a subject that seems (for some people) to cause confusion.
A MIDI channel allows a specific device to receive its own set of MIDI data, so any MIDI data sent on, say, channel 1 will only be received by a connected device, such as a MIDI synthesizer, that is set to use MIDI channel 1.
When a MIDI device is set to “All” it will receive data from all the other interconnected devices.
This allows us to control separate devices from separate sources. So in your DAW you could have three different keyboards controlling three different VST synthesizers, plus a control surface, set up as channels 1, 2 and 3 for the individual synthesizers and channel 4 as the control channel for a mixer.
A single MIDI cable can carry up to sixteen channels of MIDI data, each of which can be routed to a separate device. Each interaction with a key, button, knob or slider is converted into a MIDI event, which specifies musical instructions, such as a note’s pitch, timing and loudness. One common MIDI application is to play a MIDI keyboard or other controller and use it to trigger a digital sound module (which contains synthesized musical sounds) to generate sounds, which the audience hears produced by a keyboard amplifier. MIDI data can be transferred via MIDI or USB cable, or recorded to a sequencer or digital audio workstation to be edited or played back.
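
As a small, generic C++ illustration (not SynthEdit code) of how the channel travels with each message: in MIDI 1.0 the channel is packed into the low four bits of the status byte, and a receiver either matches it against its own channel or, in omni (“All”) mode, accepts everything:

    #include <cstdint>

    // Illustrative channel filter. A device set to 'myChannel' (1-16) responds only to
    // channel messages addressed to it; in omni ("All") mode it accepts every channel.
    bool acceptsMessage(std::uint8_t statusByte, int myChannel, bool omni)
    {
        int messageChannel = (statusByte & 0x0F) + 1; // low nibble 0-15 maps to channels 1-16
        return omni || (messageChannel == myChannel);
    }

    // Example: status byte 0x90 is a Note On sent on channel 1, so
    // acceptsMessage(0x90, 1, false) is true and acceptsMessage(0x90, 2, false) is false.
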
Many groups (Tangerine Dream for one) and studios have embraced this technology, as it's much easier to transport, nowhere near as heavy, always stays in tune (unlike analogue synthesizers), and is a lot cheaper than a stack of hardware synthesizers. Which is easier to transport, maintain and set up: three laptops and sound interfaces feeding into a mixer, or three Moog Modulars?

Preserving a performance.
A file format that stores and exchanges the data is also defined.
The advantages of MIDI include small file size, ease of modification (there are many software MIDI editors) along with a wide choice of electronic instruments, synthesizers, software synthesizers, or digitally sampled sounds.
A MIDI recording of a performance on a keyboard could sound like a piano or other keyboard instrument; however, since MIDI records the messages and information about their notes and not the specific sounds, this recording could be changed to many other sounds, ranging from synthesized or sampled guitar or flute to full orchestra.

Ease of communication.
Before the development of MIDI, electronic musical instruments from different manufacturers could generally not communicate with each other. This meant that a musician could not, for example, plug a Roland keyboard into a Yamaha synthesizer module. With MIDI, any MIDI-compatible keyboard (or other controller device) can be connected to any other MIDI-compatible sequencer, sound module, drum machine, synthesizer, or computer, even if they are made by different manufacturers.

MIDI 2.0
SynthEdit MIDI pins can handle both MIDI 1.0 and MIDI 2.0 standards.
From Version 1.5 many SynthEdit modules accept either MIDI 1 or MIDI 2 but send MIDI 2.

About MIDI 2.
Back in 1983, musical instrument companies that were in fierce competition nonetheless banded together to create a specification to allow musical instruments to communicate with each other, and with computers. This was MIDI 1.0, the first universal Musical Instrument Digital Interface.
Nearly four decades on, we can see that MIDI was crafted so well that it has remained useful and relevant. Its ability to join computers and musical instruments has become a major part of live performance and recording, controlling mixers, programming synthesizers, and even stage lighting.
Now, MIDI 2.0 is taking the technology even further, deliberately retaining backward compatibility with MIDI 1.0 equipment and software already in use.

Here’s why MIDI 2.0 is the biggest advance in music technology in decades:

MIDI 2.0 Means Two-way MIDI Conversations
MIDI 1.0 messages were unidirectional: from the transmitter to a receiver. MIDI 2.0 is bi-directional and changes MIDI from a monologue to a dialogue between computers and instruments.
For example, with the new MIDI-CI (Capability Inquiry) messages, MIDI 2.0 devices can talk to each other, and auto-configure themselves to work together. They can also exchange information on functionality, which is key to backward compatibility.
MIDI 2.0 software and equipment can “talk” to a device, and if that device doesn't support MIDI 2.0, it can simply switch to the old MIDI 1.0 protocol.

Higher Resolution, More Controllers and Better Timing
To deliver an even higher level of musical and artistic expressiveness, MIDI 2.0 re-imagines the role of performance controllers, which is the aspect of MIDI that converts human performance gestures to control signals computers can understand.
Controllers have become easier to use, and there are more of them: over 32,000 controllers, including controls for individual notes.
Enhanced, 32-bit resolution gives controls a smoother, continuous, “analogue” feel. Note-On options were added for articulation control and setting precise note pitch.
In addition to this, dynamic response (velocity) has been improved.
What's more, major timing improvements in MIDI 2.0 can also apply to MIDI 1.0 devices; in fact, some MIDI 1.0 gear can actually “retrofit” certain MIDI 2.0 features.

Profile Configuration
MIDI gear can now have Profiles that can dynamically configure a device for a particular user scenario.
If a control surface queries a device with a “mixer” Profile, then the controls will map to faders, pan-pots, and other mixer parameters.
But when connected with a “drawbar organ” Profile, that same control surface can map its controls to virtual drawbars and other keyboard parameters, or map to dimmers if the profile is a lighting controller. This saves enormously on setup time, improves workflow, and eliminates time consuming manual programming.

Property Exchange
Profiles set up an entire device, and Property Exchange messages provide specific, detailed information sharing.
These messages can discover, retrieve, and set many properties like preset names, individual parameter settings, and unique functionalities.
Essentially, everything one MIDI 2.0 device needs to know about the MIDI 2.0 device it’s connected to.
For example, your DAW or recording software could display everything you need to know about a synthesizer on-screen, effectively bringing hardware synthesizers up to the same level of programmability as their software counterparts.

Built for the Future.
Unlike MIDI 1.0, which was initially tied to a specific hardware implementation, the new Universal MIDI Packet format makes it easy to implement MIDI 2.0 on any digital transport (like USB or Ethernet). To enable future applications that we haven’t yet developed, there’s ample space still in the standard reserved for brand-new MIDI specifications and messages.

For more detailed information on the MIDI 2 standards and protocols (and they are very detailed and complex), visit the MIDI organisation's website:
https://www.midi.org/midi-articles/details-about-midi-2-0-midi-ci-profiles-and-property-exchange

Converting MIDI 1 to MIDI 2
SynthEdit provides a MIDI converter module that can convert MIDI 1 to MIDI 2 and vice versa.
This is useful for maintaining compatibility with MIDI 1 only modules.
MIDI 2.0 is now the default MIDI standard, because MIDI 1, MIDI MPE, and Steinberg Note-Expression can all be converted losslessly to MIDI 2. However it’s not always possible to convert MIDI 2 to MIDI 1.
The SynthEdit SDK now provides helper classes that will convert MIDI for you.
This allows you to write your MIDI code without having to handle all the different types of MIDI.
Note: It's recommended that you write your modules to use MIDI 2.
The SDK contains the ‘MIDI to Gate’ module that shows how to write a MIDI 2 module that also accepts MIDI 1 transparently.

You can intercept the MIDI signals any time before they reach the Patch Automator.
Note that the MIDI-CV module also secretly sends its MIDI data there too.
By default the MIDI in SE 1.5 is Version 2.0. The MIDI-In module converts everything to MIDI V2.0. You can also send Version 1.0, but SE's own MIDI modules will tend to convert it back into Version 2.0 if they get a chance.
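
To give a flavour of what such a conversion involves (a simplified sketch, not the SDK's helper classes; the MIDI 2.0 specification defines its own exact scaling rules): MIDI 1.0 values are 7-bit, so converting upwards means widening them into the larger MIDI 2.0 ranges, for example a 7-bit velocity into a 16-bit one.

    #include <cstdint>

    // Simplified illustration: widen a 7-bit MIDI 1.0 value (0-127) to 16 bits (0-65535)
    // by bit replication, so 0 maps to 0 and 127 maps to 65535.
    // (The MIDI 2.0 spec's recommended scaling rules differ slightly from this.)
    std::uint16_t widen7to16(std::uint8_t value7)
    {
        std::uint16_t v = value7 & 0x7F;
        return static_cast<std::uint16_t>((v << 9) | (v << 2) | (v >> 5));
    }

    // Example: widen7to16(100) gives 51603, roughly the same proportion of full scale.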

MIDI Messages and (basic) Standards.
MIDI messages are made up of 8-bit bytes that are transmitted serially at a rate of 31.25 kbit/s. This rate was chosen because it is an exact division of 1 MHz, the operational speed of many early microprocessors. The first bit of each word identifies whether the byte is a status byte or a data byte, and is followed by seven bits of information. A start bit and a stop bit are added to each byte for framing purposes, so a MIDI byte requires ten bits for transmission, which means a typical three-byte message takes roughly one millisecond to send.

A MIDI link can carry sixteen independent channels of information. The channels are numbered 1–16. A device can be configured to only listen to specific channels and to ignore the messages sent on other channels (omni off mode), or it can listen to all channels, effectively ignoring the channel address (omni on). An individual device may be monophonic (the start of a new note-on MIDI command implies the termination of the previous note), or polyphonic (multiple notes may be sounding at once, until the polyphony limit of the instrument is reached, or the notes reach the end of their decay envelope, or explicit note-off MIDI commands are received). Receiving devices can typically be set to all four combinations of omni off/on and mono/poly modes.

A MIDI message is an instruction that controls some aspect of the receiving device. A MIDI message consists of a status byte, which indicates the type of the message, followed by up to two data bytes that contain the parameters. MIDI messages can be channel messages sent on only one of the 16 channels and monitored only by devices on that channel, or system messages that all devices receive. Each receiving device ignores data not relevant to its function.  There are five types of message: Channel Voice, Channel Mode, System Common, System Real-Time, and System Exclusive.
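
A compact C++ illustration of that layout (generic, not SynthEdit code): the status byte has its top bit set and carries the message type and channel, while the data bytes that follow are 7-bit values.

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        // A Channel Voice message: one status byte (top bit set) plus up to two 7-bit data bytes.
        // Here: Note On (0x9n) on channel 3, note 69 (A4), velocity 90.
        std::uint8_t message[3] = { 0x92, 69, 90 };

        bool isStatus        = (message[0] & 0x80) != 0;  // top bit set marks a status byte
        unsigned type        = message[0] & 0xF0u;        // 0x90 = Note On
        std::uint8_t channel = (message[0] & 0x0F) + 1;   // channels 1-16
        std::uint8_t note    = message[1] & 0x7F;         // data bytes are 7-bit values (0-127)
        std::uint8_t vel     = message[2] & 0x7F;

        std::printf("status byte? %d  type 0x%02X  channel %d  note %d  velocity %d\n",
                    isStatus, type, channel, note, vel);
        return 0;
    }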

Channel Voice messages transmit real-time performance data over a single channel. Examples include “note-on” messages which contain a MIDI note number that specifies the note’s pitch, a velocity value that indicates how forcefully the note was played, and the channel number; “note-off” messages that end a note; program change messages that change a device’s patch; and control changes that allow adjustment of an instrument’s parameters. MIDI notes are numbered from 0 to 127 assigned to C−1 to G9. This corresponds to a range of 8.175799 to 12543.85 Hz (assuming equal temperament and 440 Hz A4) and extends beyond the 88 note piano range from A0 to C8.
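
That range follows the standard equal-temperament mapping, frequency = 440 × 2^((note − 69) / 12) Hz, with MIDI note 69 as A4; a small C++ check reproduces the figures quoted above:

    #include <cmath>
    #include <cstdio>

    // Frequency of a MIDI note in equal temperament with A4 (note 69) tuned to 440 Hz.
    double noteToHz(int note)
    {
        return 440.0 * std::pow(2.0, (note - 69) / 12.0);
    }

    int main()
    {
        std::printf("note 0   = %.6f Hz\n", noteToHz(0));    // ~8.175799 Hz (C-1)
        std::printf("note 69  = %.6f Hz\n", noteToHz(69));   // 440 Hz (A4)
        std::printf("note 127 = %.6f Hz\n", noteToHz(127));  // ~12543.85 Hz (G9)
        return 0;
    }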

System Exclusive messages
System Exclusive (SysEx) messages are a major reason for the flexibility and longevity of the MIDI standard. Manufacturers use them to create proprietary messages that control their equipment more thoroughly than standard MIDI messages could.  SysEx messages use the MIDI protocol to send information about the synthesizer’s parameters, rather than performance data such as which notes are being played and how loud. SysEx messages are addressed to a specific device in a system. Each manufacturer has a unique identifier that is included in its SysEx messages, which helps ensure that only the targeted device responds to the message, and that all others ignore it. Many instruments also include a SysEx ID setting, so a controller can address two devices of the same model independently. SysEx messages can include functionality beyond what the MIDI standard provides.

Time code
A sequencer can drive a MIDI system with its internal clock, but when a system contains multiple sequencers, they must synchronize to a common clock. MIDI Time Code (MTC), developed by Digidesign, implements SysEx messages that have been developed specifically for timing purposes, and is able to translate to and from the SMPTE time code standard. MIDI Clock is based on tempo, but SMPTE time code is based on frames per second, and is independent of tempo. MTC, like SMPTE code, includes position information, and can adjust itself if a timing pulse is lost. MIDI interfaces such as Mark of the Unicorn's MIDI Timepiece can convert SMPTE code to MTC.

More Info: https://en.wikipedia.org/wiki/MIDI