
# Media Representation in Processing

Module by: Davide Rocchesso, Pietro Polotti

Summary: Introduction to how images and sounds (including colors and coordinates) are represented in Processing.

## Visual Elements

### Coordinates

In Processing, the representation of graphic objects is based on a 3D Cartesian coordinate system, as displayed in Figure 1.

2D images are processed by acting on the X-Y plane, thus assuming that the Z coordinate is zero. The function size() defines the display window size and the rendering engine that will be used to paint onto the window. The default engine is JAVA2D, the 2D graphics Java library. A bidimensional rendering engine especially suitable for faster pixel-based image processing is P2D (Processing 2D). To program in 3D, one must choose either the P3D (Processing 3D) rendering engine, especially suited for web-oriented graphics, or OPENGL, which delegates many typical 3D operations to the graphics board, thus freeing the CPU from many computations. Moreover, if the objective is high-quality printing with vector graphics, a PDF rendering option is available.
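The renderer is selected as the third argument of size(), so the choices above can be summarized in a short sketch. This is an illustrative snippet, not part of the original module; only one size() call can be active in a sketch, hence the commented alternatives.

```
// default 2D renderer (JAVA2D)
size(400, 300);

// pixel-oriented 2D renderer
// size(400, 300, P2D);

// 3D renderers: software (P3D) or hardware-accelerated (OPENGL)
// size(400, 300, P3D);
// import processing.opengl.*;  // needed for OPENGL
// size(400, 300, OPENGL);

// high-quality vector output to a file
// import processing.pdf.*;
// size(400, 300, PDF, "output.pdf");
```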

### Images

In Processing, an image can be assigned to an object of the class PImage. The function loadImage("myImage") takes a file (gif or jpg) myImage, containing the pixel coding of an image, and returns the content of that image, which can be assigned to a variable of type PImage. The file myImage must be placed in the data folder within the directory having the same name as the Processing sketch we are working on.

#### Note:

When the New command is executed, Processing creates a folder named sketch_??????? within the Processing directory, corresponding to the name assigned by the system to the newly created file. This folder is accessible from the Processing menu item Sketch/Add File.
The class PImage gives access, through its fields width and height, to the width and height of the loaded image. The image content is accessed via the pixels[] field.


size(400,300);
PImage b = loadImage("gondoliers.jpg"); // the image file must be in the data folder
println("width=" + b.width + " height=" + b.height);
image(b, 0, 0, 400, 300); // position (0,0); width=400; height=300
image(b, 20, 10, 100, 80); // position (20,10); width=100; height=80


### Colors

Since our color receptors (cones), each tuned to a wavelength region, are of three kinds, color models always refer to a three-dimensional space. In additive color models, each of the three axes corresponds to a base color, and by mixing three colored light beams one can obtain all colors within a gamut volume in the space defined by the three axes. The three base colors can be chosen arbitrarily or, more often, based on the application domain (e.g., the colors of three phosphors or laser beams). In printing processes, subtractive color models are used, where the starting point is the white surface and primary ink colors subtract color from white.

#### Note:

In Processing, color is a primitive type used to specify colors. It is realized as a 32-bit number, where the first byte specifies the alpha value and the other three bytes specify a triple either in the RGB or in the HSB model. The choice between the two models is made with the colorMode() function. With three bytes, 256×256×256 = 16777216 colors are representable.

#### The RGB model

Colors are represented by a triple of numbers, each giving the intensity of one of the primary colors Red, Green, and Blue. Each number can be an unsigned integer, thus taking values between 0 and 255, or be expressed as a floating-point number between 0.0 and 1.0. For even greater flexibility, the model, type, and range of colors can be set with the function colorMode(). The RGB model is additive.
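For instance, the following illustrative fragment (not from the original module) shows how colorMode() switches the RGB component range from 0–255 integers to 0.0–1.0 floats:

```
colorMode(RGB, 255);               // default: components between 0 and 255
color red255 = color(255, 0, 0);
colorMode(RGB, 1.0);               // components as floats between 0.0 and 1.0
color red1 = color(1.0, 0.0, 0.0);
// red255 and red1 encode the same pure red
```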

#### HSB Model

Colors are represented by a triple of numbers, the first number giving the Hue, the second giving Saturation, and the third giving the Brightness.

##### Note:
Often the model is called HSV, where V stands for Value.
The hue takes values in degrees between 0 (red) and 360, with the various hues arranged along a circumference and red positioned at 0˚. Saturation and brightness vary between 0 and 100. The saturation is the degree of purity of the color: if white light is added to a pure color, its degree of purity decreases, until the color eventually sits on the gray scale when saturation is zero. In physical terms, the brightness is proportional to the signal power spectrum; intuitively, the brightness increases when the light intensity increases.

The three-dimensional HSB space is well represented by a cylinder, with the hue (nominal scale) arranged along the circumference, the saturation (ratio scale) arranged along the radius, and the brightness (interval scale) arranged along the longitudinal axis. Alternatively, such a three-dimensional space can be collapsed into two dimensions, as in the color chooser of the image-processing program Gimp, displayed in Figure 2. Along the circumference, the three primary colors (red, green, and blue) are visible, 120˚ apart from each other, separated by the secondary colors (magenta, cyan, yellow). Each secondary color is complementary to the primary color facing it across the circumference. For instance, if we take the green component out of a white light, we obtain a magenta light. The triangle inscribed in the circumference has a vertex pointing to a selected hue. The opposite side contains the gray scale, thus representing colors with null saturation and variable brightness. Going from the reference vertex to the opposite side, we have a gradual decrease in saturation.
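The arrangement of hues along a circumference can be visualized with a small sketch. This is an illustrative example, not part of the original module: it draws a hue wheel at full saturation and brightness.

```
size(200, 200);
colorMode(HSB, 360, 100, 100);   // hue in degrees, saturation and brightness 0-100
for (int angle = 0; angle < 360; angle++) {
  stroke(angle, 100, 100);       // hue varies; saturation and brightness maximal
  line(100, 100,
       100 + 90 * cos(radians(angle)),
       100 + 90 * sin(radians(angle)));
}
```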

#### Alpha channel

The alpha channel is a byte used to blend and interpolate between images, for example to render transparency. It can be obtained, from a variable of type color, with the method alpha(). The alpha channel can be manipulated with the method blend() of the class PImage.

size(400, 300);
PImage b = loadImage("gondoliers.jpg");
PImage a = loadImage("gondoliers.jpg");
float ramp = 0;
for (int j = 0; j < b.height; j++)
  for (int i = 0; i < b.width; i++) {
    b.set(i, j, b.get(i,j) + color(0, 0, 0, 255 - (int)((1-ramp)*255)));
    ramp = ramp + 1/(float)(b.width * b.height);
  }
a.blend(b, 0, 0, b.width, b.height, 80, 10, 450, 250, BLEND);
image(a, 0, 0, 400, 300);

In Processing, it is possible to assign a color to a variable of type color by means of the function color(), after the model has been set with colorMode(). The functions red(), green(), blue(), hue(), saturation(), and brightness() extract the components of a color, making it possible to move from one model to the other.


colorMode(RGB);
color c1 = color(102, 30,29);
colorMode(HSB);
color c2 = color(hue(c1), saturation(c1), brightness(c1));
colorMode(RGB);
color c3 = color(red(c2), green(c2), blue(c2));
// the variables c1, c2, and c3 contain the coding of the same color


#### Tinging an image

An image can be tinged with a color, and its transparency can be set by assigning a given value to the alpha channel. For this purpose, the function tint() can be used. For example, a blue tone can be assigned to the inlaid image of Example 1 by just preceding the second image() command with tint(0, 153, 204, 126).
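A minimal sketch of this operation (illustrative; it assumes an image file gondoliers.jpg in the data folder, as in Example 2) could be:

```
size(400, 300);
PImage b = loadImage("gondoliers.jpg");  // any image in the data folder
image(b, 0, 0, 400, 300);                // untinted copy
tint(0, 153, 204, 126);                  // blue tinge, alpha 126 out of 255
image(b, 20, 10, 100, 80);               // inlaid image, tinted and semi-transparent
```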

### Translations, Rotations, and Scale Transformations

#### Representing Points and Vectors

In computer graphics, points and vectors are represented with the

Definition 1: homogeneous coordinates
Quadruples of numbers, where the first triple is to be read in the X-Y-Z space, while the fourth number indicates a vector if it takes value 0, or a point if it takes value 1.

A translation is obtained by adding, in homogeneous coordinates, a vector to a point, and the result is a point. Alternatively, we can see a translation as a matrix-vector product (see Matrix Arithmetic), where the matrix is

    | 1 0 0 tx |
    | 0 1 0 ty |
    | 0 0 1 tz |
    | 0 0 0 1  |

and the vector is the one representing the point (x, y, z, 1). An anti-clockwise rotation by the angle θ around the z axis (roll) is obtained by the rotation matrix

    | cosθ -sinθ 0 0 |
    | sinθ  cosθ 0 0 |
    |  0     0   1 0 |
    |  0     0   0 1 |

Rotations around the x axis (pitch) and the y axis (yaw) are realized by rotation matrices of the same kind, and a rotation around an arbitrary axis can be obtained by composition (left multiplication) of elementary rotations around each of the main axes.
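These matrix operations can be checked numerically in Processing itself. The following sketch is an illustrative aside, with a hypothetical helper applyMat(); it applies the translation matrix and the roll matrix to the homogeneous point (1, 0, 0, 1):

```
void setup() {
  float tx = 2, ty = 3, tz = 0;
  float[][] T  = { {1, 0, 0, tx}, {0, 1, 0, ty}, {0, 0, 1, tz}, {0, 0, 0, 1} };
  float th = HALF_PI;  // 90-degree anti-clockwise roll
  float[][] Rz = { {cos(th), -sin(th), 0, 0}, {sin(th), cos(th), 0, 0},
                   {0, 0, 1, 0}, {0, 0, 0, 1} };
  float[] p = {1, 0, 0, 1};   // a point: the fourth component is 1
  println(applyMat(T, p));    // translated point: (3, 3, 0, 1)
  println(applyMat(Rz, p));   // rotated point: approximately (0, 1, 0, 1)
}

// multiply a 4x4 matrix by a homogeneous 4-vector
float[] applyMat(float[][] M, float[] v) {
  float[] r = new float[4];
  for (int i = 0; i < 4; i++)
    for (int j = 0; j < 4; j++)
      r[i] += M[i][j] * v[j];
  return r;
}
```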

#### Translations

The function translate() moves an object in the image window. It takes two or three parameters, being the displacements along the x, y (and z) directions, respectively.
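For example (an illustrative snippet), two rect() calls with identical coordinates produce displaced rectangles once translate() has moved the origin:

```
size(200, 200);
rect(0, 0, 40, 40);   // drawn at the top left corner
translate(60, 30);    // move the origin by (60, 30)
rect(0, 0, 40, 40);   // same coordinates, displaced rectangle
```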

#### Rotations

In two dimensions, the function rotate() is used to rotate objects in the image window. This is obtained by (left) multiplying the coordinates of each pixel of the object by a rotation matrix. Rotations are always specified around the top left corner of the window (coordinate (0,0)). Translations can be used to move the rotation axis to other points. Rotation angles are specified in radians. Recall that 2π rad = 360˚. For example, insert the rotation rotate(PI/3) before the second image() command in Example 1. In three dimensions, we can use the elementary rotations around the coordinate axes rotateX(), rotateY(), and rotateZ().
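Since the rotation pivot is the top left corner, rotating an object about its own centre requires a preliminary translation. A minimal illustrative sketch:

```
size(200, 200);
translate(100, 100);   // move the rotation axis to the window centre
rotate(PI / 3);        // 60 degrees, expressed in radians
rectMode(CENTER);
rect(0, 0, 80, 40);    // rectangle rotated about the window centre
```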

#### Scale Transformations

The function scale() expands or contracts an object by multiplying its point coordinates by a constant. When it is invoked with two or three parameters, different scalings can be applied along the different axes.
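For example (an illustrative snippet), a nonuniform scaling along the two axes:

```
size(200, 200);
rect(10, 10, 30, 20);   // original rectangle
scale(2.0, 0.5);        // stretch x by 2, compress y by 2
rect(10, 10, 30, 20);   // same coordinates, scaled size and position
```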

### Typographic Elements

Every tool or language for media manipulation gives the opportunity to work with written words and with their fundamental visual elements: typographic characters.

The appearance of type has two main components: font and size.

Processing has the class PFont and the methods loadFont() (to load a font and assign it to an object of the PFont class) and textFont() (to activate a font with a specific size). In order to load a font, it has to be pre-loaded into the data directory of the current sketch. The tool Create Font, accessible from the Tools menu in Processing, creates the bitmaps of the characters that the programmer intends to use. The file with the bitmaps is put in the data directory. After these preliminary operations, the font can be used to write text, using the function text(). With this function, a string of characters can be put in the 2D or 3D space, possibly inserted within a rectangular box. The alignment of characters in the box is governed by the function textAlign(). In the default configuration, the written text can be spatially transformed like any other object. The color of characters can be set with the usual fill(), as for any other graphic object.

#### Example 3: Overlapped text

PFont fonte; // the font has previously been created in the data folder
fonte = loadFont("HoeflerText-Black-48.vlw");
textFont(fonte, 12);
fill(10, 20, 250, 80);
textAlign(RIGHT);
text("pippo pippo non lo sa", 10, 14, 35, 70);
textFont(fonte, 24);
fill(200, 0, 0, 100);
text("ppnls", 25, 5, 50, 90);

Processing allows tight control of the spatial occupation of characters and of the distance between contiguous characters (see Figure 3). The function textWidth() computes the horizontal extension of a character or a string. It can be used, together with the exact coordinates passed to text(), to control the kerning and the tracking between characters. textSize() redefines the size of characters. textLeading() redefines the distance in pixels between adjacent text lines. This distance is measured between the baselines of the strings of characters. Letters such as "p" or "q" extend below the baseline for a number of pixels that can be obtained with textDescent(). Conversely, textAscent() returns the maximum extension above the baseline (typically, the height of the letter "d").
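For instance, textWidth() can be used to place characters one at a time with a custom tracking. This is an illustrative sketch; it assumes a .vlw font previously created in the data folder, as in Example 3.

```
size(200, 100);
PFont f = loadFont("HoeflerText-Black-48.vlw");  // pre-created bitmap font
textFont(f, 32);
String s = "kern";
float x = 10;
float tracking = 4;    // extra pixels inserted between characters
for (int i = 0; i < s.length(); i++) {
  text(s.charAt(i), x, 50);
  x += textWidth(s.charAt(i)) + tracking;   // advance by the character width
}
```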

## Auditory Elements

### Sounds

Until version beta 112, Processing provided some audio functionality through core primitives; those older versions offered only two basic primitives, to load and play back .wav files. More recent versions delegate sound management and processing to external libraries. The most used libraries are Ess, Sonia, and Minim; only the latter is included in the base installation of Processing, while Ess and Sonia need an explicit installation. Recently, a well-structured and documented Java library called Beads has also been introduced. It is well suited to the construction of audio-processing algorithms based on chains of basic objects. As in the case of images, in order to process and play back sounds the source files have to be stored in the data folder of the current sketch. The library Sonia is the most complex one: with its functions, one can do sample playback, realtime Fourier-based spectral analysis, and .wav file saving. In order to use the Sonia library, the programmer has to download the .zip file from the Sonia website. Once decompressed, the directory Sonia_?_? has to be copied into the directory Processing/libraries. Finally, the import command has to be inserted into the code by selecting the menu item Sketch / Import Library / Sonia_?_?.

#### Note:

In order to run the applets produced with Sonia from a web browser, Phil Burk's JSyn plugin has to be downloaded and installed from the site http://www.softsynth.com/jsyn/plugins/.
The library Minim, based on Java Sound, is more user-friendly, well-documented, and recommended if one wants to work with sounds employing high-level primitives, without dealing with low-level numerical details and buffer management.
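A minimal Minim playback sketch, for comparison with the Sonia code below (illustrative; it assumes an audio file flauto.aif in the data folder, as in the examples of this module):

```
import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  size(100, 100);
  minim = new Minim(this);
  player = minim.loadFile("flauto.aif");  // file stored in the data folder
  player.play();
}

void draw() {
}

// safely stop the Minim engine upon shutdown
public void stop() {
  player.close();
  minim.stop();
  super.stop();
}
```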

### Timbre

In this section, we first use and then analyze an application for the exploration of timbres, similar in conception to the Color Chooser of Figure 2, here called Sound Chooser. For the moment, let us think about a sound timbre in analogy with color in images. For example, the various instruments of the orchestra have different and characterizing timbres (colors). Later on, we will define the physical and perceptual aspects of timbre more accurately. In the Sound Chooser applet, four sounds with different timbres can be played by clicking on one of the marked radii. Each radius corresponds to a musical instrument (timbre/color). By changing position along the radius it is possible to hear how the brightness changes. More precisely, as we proceed toward the centre, the sound gets poorer, i.e., darker.

Let us analyze the salient aspects of the Processing code that implements the Sound Chooser. The Sonia.start(this) command is necessary to activate the Sonia audio engine. The line Sample mySample1 declares a variable aimed at containing audio samples. Several methods can be applied to such a variable; among these, the play method plays the sound sample back. The draw() section defines the graphic aspect of the applet. Finally, with the function mouseReleased(), we detect when the mouse is released after being pressed, and where it has been released. At this point a sequence of if conditions finds which instrument/timbre has been selected according to the clicking point. Moreover, within the function mouseReleased() the function filtra(float[] DATAF, float[] DATA, float WC, float RO) is invoked. This function, implemented in the last segment of the code listing, performs sound filtering. More precisely, it is a low-pass filter, i.e., a filter that leaves the low frequencies unaltered and reduces the intensity of the high frequencies. According to the radial position of the mouse click, the filtering effect changes, being more dramatic (that is, the sound becomes darker) as the mouse is released closer and closer to the centre. A lighter realization of the Sound Chooser by means of the library Minim is proposed in Exercise 4. Exercise 5 explores the recent library Beads.

Table 3: The Sound Chooser applet — choosing a timbre (trumpet, oboe, violin, flute) and controlling its brightness.


import pitaru.sonia_v2_9.*;

Sample mySample1, mySample2, mySample3, mySample4;
Sample mySample1F, mySample2F, mySample3F, mySample4F;

float[] data1, data2, data3, data4;
float[] data1F, data2F, data3F, data4F;

int sr = 11025;  // sampling rate

void setup()
{
size(200, 200);
colorMode(HSB, 360, height, height);
Sonia.start(this);

mySample1 = new Sample("flauto.aif");
mySample2 = new Sample("oboe.wav");
mySample3 = new Sample("tromba.wav");
mySample4 = new Sample("violino.wav");

mySample1F = new Sample("flauto.aif");
// ... omitted ...

data1  = new float[mySample1.getNumFrames()];
// creates new arrays the length of the sample
// for the original sound
// ... omitted ...

data1F  = new float[mySample1.getNumFrames()];
// creates new arrays the length of the sample
// for the filtered sound
// ... omitted ...

// ... omitted ...

}

void draw()
{
// ... omitted ...
}

void mouseReleased()
{

float ro;
float roLin;
float wc;

// FLUTE
if ((mouseX > 95) && (mouseX < 105)&& (mouseY > 50)&& (mouseY < 90)) {

roLin = (mouseY-49.99)/41;
ro = pow(roLin,.33);
wc = 298*(TWO_PI/sr);
filtra(data1F, data1, wc, ro);

mySample1F.write(data1F);
mySample1F.play();
}
// ... omitted ...

}

// filtra: low-pass filtering function
void filtra(float[] DATAF, float[] DATA, float WC, float RO) {

float G;
float RO2;
RO2 = pow(RO, 2);
G = (1-RO)*sqrt(1-2*RO*cos(2*WC)+RO2)*4; // (*4) is for having it louder

for(int i = 3; i < DATA.length; i++){
DATAF[i] = G*DATA[i]+2*RO*cos(WC)*DATAF[i-1]-RO2*DATAF[i-2];
//recursive filtering
}
}

// safely stop the Sonia engine upon shutdown.
public void stop(){
Sonia.stop();
super.stop();
}



### Exercise 1

The content of a PImage object is accessible through its pixels[] field. The pixels, corresponding to a row-by-row reading of the image, are contained in this array of size width*height. Modify the code in Example 2 to use the field pixels[] instead of the method get(). The final outcome should remain the same.

#### Solution

The invocation b.set() should be replaced by


b.set(i,j,b.pixels[j*b.width+i]+ color(0,0,0, 255 - (int)((1-ramp)*255)) );


### Exercise 2

Complete the code reported in Table 3 to obtain the complete Sound Chooser applet.

### Exercise 3

Add some color to the radii of the Sound Chooser, by replacing the line() instructions with rect() instructions and coloring the bars with a brightness that increases going from the centre to the periphery.

### Exercise 4

Produce a new version of the Sound Chooser of Exercise 2 employing the library Minim. Note the compactness and simplicity of the resulting code.

#### Solution

import ddf.minim.*;
import ddf.minim.effects.*;

Minim minim;
AudioPlayer mySample1, mySample2, mySample3, mySample4;
LowPassSP lpf1, lpf2, lpf3, lpf4;
float cutoff1, cutoff2, cutoff3, cutoff4;

void setup()
{
size(200, 200);
colorMode(HSB, 360, height, height);
minim = new Minim(this);

// load the four sound files from the data folder
mySample1 = minim.loadFile("flauto.aif");
mySample2 = minim.loadFile("oboe.wav");
mySample3 = minim.loadFile("tromba.wav");
mySample4 = minim.loadFile("violino.wav");

lpf1 = new LowPassSP(4000, mySample1.sampleRate());
lpf2 = new LowPassSP(4000, mySample2.sampleRate());
lpf3 = new LowPassSP(4000, mySample3.sampleRate());
lpf4 = new LowPassSP(4000, mySample4.sampleRate());

// attach each filter to its player
mySample1.addEffect(lpf1);
mySample2.addEffect(lpf2);
mySample3.addEffect(lpf3);
mySample4.addEffect(lpf4);

}

void draw()
{
stroke(255);
strokeWeight(1);
fill(0, 88, 88);
ellipseMode(CORNER);
ellipse(50,50,100,100);

beginShape(LINES);
vertex(50, 100);
vertex(90, 100);

vertex(110, 100);
vertex(150, 100);

vertex(100, 50);
vertex(100, 90);

vertex(100, 110);
vertex(100, 150);
endShape();
}

void mouseReleased()
{

// FLUTE
if ((mouseX > 95) && (mouseX < 105)&& (mouseY > 50)&& (mouseY < 90)) {
cutoff1 = map(mouseY, 50, 90, 1000, 30);
lpf1.setFreq(cutoff1);
println(mouseY + " +  " +cutoff1);
mySample1.rewind();
mySample1.play();
}

// OBOE
if ((mouseX > 110) && (mouseX < 149)&& (mouseY > 95)&& (mouseY < 105)) {
cutoff2 = map(mouseX, 110, 149, 30, 1000);
lpf2.setFreq(cutoff2);
println(mouseX + " +  " +cutoff2);
mySample2.rewind();
mySample2.play();
}

// TRUMPET
if ((mouseX > 95) && (mouseX < 105)&& (mouseY > 110)&& (mouseY < 149)) {
cutoff3 = map(mouseY, 110, 149, 30, 1000);
lpf3.setFreq(cutoff3);
println(mouseY + " +  " +cutoff3);
mySample3.rewind();
mySample3.play();
}

// VIOLIN
if ((mouseX > 50) && (mouseX < 90)&& (mouseY > 95)&& (mouseY < 105)) {
cutoff4 = map(mouseX, 50, 90, 1000, 30);
lpf4.setFreq(cutoff4);
println(mouseX + " +  " +cutoff4);
mySample4.rewind();
mySample4.play();
}
}

// safely stop the Minim engine upon shutdown.
public void stop(){
mySample1.close();
mySample2.close();
mySample3.close();
mySample4.close();
minim.stop();
super.stop();

}



### Exercise 5

Produce a new version of the Sound Chooser of problem Exercise 2 using the Beads library. The signal-processing flow is particularly readable from the resulting code.

#### Solution



import beads.*; // the Beads library must be installed in Processing/libraries

AudioContext ac;

String sourceFile; //path to audio file
SamplePlayer mySample1, mySample2, mySample3, mySample4;
Gain g;
Glide cutoff1, cutoff2, cutoff3, cutoff4;
OnePoleFilter lpf1, lpf2, lpf3, lpf4;

void setup() {
size(200, 200);
colorMode(HSB, 360, height, height);

ac = new AudioContext();

sourceFile = sketchPath("") + "data/flauto.aif";
try {
mySample1 = new SamplePlayer(ac, new Sample(sourceFile));
}
catch (Exception e) {
println("Exception while attempting to load sample.");
e.printStackTrace(); // description of error
exit();
}
mySample1.setKillOnEnd(false);

sourceFile = sketchPath("") + "data/oboe.wav";
try {
mySample2 = new SamplePlayer(ac, new Sample(sourceFile));
}
catch (Exception e) {
println("Exception while attempting to load sample.");
e.printStackTrace(); // description of error
exit();
}
mySample2.setKillOnEnd(false);

sourceFile = sketchPath("") + "data/tromba.wav";
try {
mySample3 = new SamplePlayer(ac, new Sample(sourceFile));
}
catch (Exception e) {
println("Exception while attempting to load sample.");
e.printStackTrace(); // description of error
exit();
}
mySample3.setKillOnEnd(false);

sourceFile = sketchPath("") + "data/violino.wav";
try {
mySample4 = new SamplePlayer(ac, new Sample(sourceFile));
}
catch (Exception e) {
println("Exception while attempting to load sample.");
e.printStackTrace(); // description of error
exit();
}
mySample4.setKillOnEnd(false);

cutoff1 = new Glide(ac, 1000, 20);
lpf1 = new OnePoleFilter(ac, cutoff1);
cutoff2 = new Glide(ac, 1000, 20);
lpf2 = new OnePoleFilter(ac, cutoff2);
cutoff3 = new Glide(ac, 1000, 20);
lpf3 = new OnePoleFilter(ac, cutoff3);
cutoff4 = new Glide(ac, 1000, 20);
lpf4 = new OnePoleFilter(ac, cutoff4);

g = new Gain(ac, 1, 1);

// connect each sample player through its filter to the gain,
// and the gain to the audio output
lpf1.addInput(mySample1);
lpf2.addInput(mySample2);
lpf3.addInput(mySample3);
lpf4.addInput(mySample4);
g.addInput(lpf1);
g.addInput(lpf2);
g.addInput(lpf3);
g.addInput(lpf4);
ac.out.addInput(g);

ac.start();
background(0);
}

void draw()
{
stroke(255);
strokeWeight(1);
fill(0, 88, 88);
ellipseMode(CORNER);
ellipse(50,50,100,100);

beginShape(LINES);
vertex(50, 100);
vertex(90, 100);

vertex(110, 100);
vertex(150, 100);

vertex(100, 50);
vertex(100, 90);

vertex(100, 110);
vertex(100, 150);
endShape();
}

void mouseReleased(){
// FLUTE
if ((mouseX > 95) && (mouseX < 105)&& (mouseY > 50)&& (mouseY < 90)) {
cutoff1.setValue(map(mouseY, 50, 90, 1000, 30));
mySample1.setToLoopStart();
mySample1.start();
}

// OBOE
if ((mouseX > 110) && (mouseX < 149)&& (mouseY > 95)&& (mouseY < 105)) {
cutoff2.setValue(map(mouseX, 110, 149, 30, 1000));
mySample2.setToLoopStart();
mySample2.start();
}

// TRUMPET
if ((mouseX > 95) && (mouseX < 105)&& (mouseY > 110)&& (mouseY < 149)) {
cutoff3.setValue(map(mouseY, 110, 149, 30, 1000));
mySample3.setToLoopStart();
mySample3.start();
}

// VIOLIN
if ((mouseX > 50) && (mouseX < 90)&& (mouseY > 95)&& (mouseY < 105)) {
cutoff4.setValue(map(mouseX, 50, 90, 1000, 30));
mySample4.setToLoopStart();
mySample4.start();
}
}


### Exercise 6: Vectorial fonts

Processing programmers are encouraged to use bitmap fonts, encoded in a file with extension .vlw. This makes Processing independent of the fonts that are actually installed on a specific machine. However, it is possible to use vectorial fonts (e.g., TrueType) by inserting their files (e.g., with extension .ttf) in the data folder. Try experimenting with vectorial fonts by using the createFont() function. If we give up the invariance of behavior across machines, we can pass this function the name of a font that is installed on a specific computer and not found in the data folder. Finally, under the JAVA2D rendering mode, it is possible to use logical fonts, by passing Serif, SansSerif, Monospaced, Dialog, or DialogInput as the string that specifies the font in createFont(). No font files need to be loaded in the data folder, but the correspondence between logical and physical fonts will be system dependent. Try experimenting with logical fonts on your computer.

#### Solution

This is one possible solution. Make sure that the fonts used are present on your computer or in the data folder.


size(200,200, JAVA2D);
PFont fonte;
fonte = loadFont("HoeflerText-Black-48.vlw"); // previously created and inserted in Data
textFont(fonte, 12);
fill(10, 20, 250, 80);
textAlign(RIGHT);
text("pippo pippo non lo sa", 10, 14,  35, 70);
textFont(fonte, 94);
textAlign(LEFT);
fill(200, 0, 0, 100);
text("ppnls", 25, 5, 150, 190);
fonte = createFont("Serif", 10, false); // Java logical font
textFont(fonte, 80);
fill(0, 200, 0, 170);
rotate(PI/6);
text("LO SO", 20, 20, 280, 280);
fonte = createFont("cmsy10", 10, true); // font installed in the system
textFont(fonte, 80);
fill(0, 20, 150, 170);
rotate(PI/12);
text("ECCO", 20, 20, 280, 280);
fonte = createFont("grunge.ttf", 10, true); // vectorial font in the Data folder
textFont(fonte, 80);
fill(100, 100, 0, 170);
rotate(-PI/6);
text("qui", 20, 20, 280, 280);

