Introduction: Capturing Emotions in 3D Printed Objects

To explore ways to create objects that can communicate emotive character, I developed a technique for converting EEG signals into 3D-printable patterns and printed baskets decorated with those patterns.

This past year I've been exploring ways to create an affect-based computation tool. Through these projects, I've learned that emotions can be measured in various ways.

In this project, I am using the DEAP dataset (https://www.eecs.qmul.ac.uk/mmv/datasets/deap/), a multimodal dataset for the analysis of human affective states.

In this Instructable, I'll demonstrate how I analyzed EEG signals and converted them into a pattern for a surface morph. I used MATLAB to organize the data into text files, which then serve as input files for Grasshopper Python.

Step 1: EEG Signal Processing

Among the EEG signals, I used the electrodes that have a high correlation with arousal and valence for each participant: for arousal I chose electrode Cz, and for valence I chose Oz. Since the data was segmented into 60-second trials with a 3-second pre-trial baseline, I averaged the signal power over every 3 seconds and removed the first 3 seconds, which left a total of 20 time-sampled values each for Cz and Oz. These values were then normalized by the total power and multiplied by the participant's subjective ratings of arousal and valence, giving me an input list of arousal and valence values.
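
I did this step in MATLAB, but the same idea can be sketched in Python/numpy. The sketch below is only an illustration (not the script I actually used), and it assumes each trial has already been extracted as a 1-D numpy array for one electrode, 63 seconds long at the DEAP sampling rate of 128 Hz:

import numpy as np

fs = 128              # DEAP preprocessed data is sampled at 128 Hz
win = 3 * fs          # 3-second windows

def windowed_power(signal, rating, out_file):
    # signal: 1-D EEG trace for one electrode and one trial
    #         (63 s = 3 s pre-trial baseline + 60 s trial)
    # rating: the participant's subjective arousal or valence rating (1-9)
    trial = signal[3 * fs:]                    # drop the first 3 seconds
    n_win = len(trial) // win                  # 20 windows for a 60-second trial
    power = np.array([np.mean(trial[k * win:(k + 1) * win] ** 2)
                      for k in range(n_win)])
    power = power / power.sum()                # normalize by the total power
    values = power * rating                    # scale by the subjective rating
    np.savetxt(out_file, values)               # one value per line, for loadtxt later
    return values

# hypothetical usage: cz is the Cz trace of one trial, 7.2 its arousal rating
# windowed_power(cz, 7.2, "Arousal_32.txt")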

Step 2: Use GH Python Remote to Create Curves

First, I had to use GH Python Remote to access functions that are not available in IronPython. Then I used Bezier curves as presented in the "emotional line" paper: https://link.springer.com/chapter/10.1007/978-3-3... Arousal values control the height of the Bezier curves, and valence values control the placement of their control points.

The following code generates the curves from the arousal and valence values.

import rhinoscriptsyntax as rs
import scriptcontext as sc
import Rhino
import Rhino.Geometry as rg
import math
import GhPython
numpy = sc.sticky['numpy']  # numpy is made available through GH Python Remote

sc.doc = Rhino.RhinoDoc.ActiveDoc

""" create a list of points """
pt = []
pt2 = []
pt3 = [] 

""" function to calculate distance between two points"""
def dist2(p0, p1):
  return (p0.X- p1.X)*(p0.X- p1.X) + (p0.Y- p1.Y)*(p0.Y- p1.Y)+(p0.Z- p1.Z)*(p0.Z- p1.Z)
  
# linear interpolation between two points, returned as an (x, y, z) tuple
def linterp(p1, p2, t):
    tx = p1[0] + (p2[0] - p1[0]) * t
    ty = p1[1] + (p2[1] - p1[1]) * t
    return (tx, ty, 0)
    
# cubic Bezier curve from four control points (De Casteljau's algorithm),
# sampled at parameter steps of 0.01
def bezier(p1, p2, p3, p4):
    pts = []
    t = 0
    while t <= 1:
        t1 = linterp(p1, p2, t)
        t2 = linterp(p2, p3, t)
        t3 = linterp(p3, p4, t)
        t4 = linterp(t1, t2, t)
        t5 = linterp(t2, t3, t)
        t6 = linterp(t4, t5, t)
        pts.append(rs.CreatePoint(t6))
        t = t + 0.01
    return pts


# p1, p2 (grid corner points), rx (number of divisions in x), mod (point filter)
# and dist are assumed to be inputs of the GhPython component
dx = (p2.X - p1.X) / (rx - 1)
ry = int(0.5 + (p2.Y - p1.Y) / dx)
sc.doc = GhPython.DocReplacement.GrasshopperDocument()

arousal_f = "C:\\Users\\junga\\Downloads\\Human-Emotion-Analysis-using-EEG-from-DEAP-dataset-master\\Human-Emotion-Analysis-using-EEG-from-DEAP-dataset-master\\Arousal_32.txt"
valence_f = "C:\\Users\\junga\\Downloads\\Human-Emotion-Analysis-using-EEG-from-DEAP-dataset-master\\Human-Emotion-Analysis-using-EEG-from-DEAP-dataset-master\\Valence_16.txt"

# load the arousal / valence values exported from MATLAB (one value per line)
arousal = numpy.loadtxt(arousal_f, delimiter="\n", unpack=False)
valence = numpy.loadtxt(valence_f, delimiter="\n", unpack=False)
sc.doc = Rhino.RhinoDoc.ActiveDoc


# build a grid of base points, keeping only the points that pass the mod filter
for i in range(0, ry):
    for j in range(0, rx + 1):
        x = p1.X + j * dx
        y = p1.Y + i * dx
        if ((i * j) % mod) == 0:
            pt.append(rs.CreatePoint(x, y, 0.0))

# first pattern row: for each pair of adjacent grid points, build two mirrored
# Bezier arcs; arousal (a) scales the arc height and valence (v) offsets the
# inner control points
for j in range(0, int(rx / 2)):
    a = arousal[j + 1] 
    v = valence[j + 1]
    idx = 2*j       
    
    diff = (pt[idx+1].X - pt[idx].X)   
    p1 = (pt[idx].X, 0, 0)
    
    p2 = (pt[idx+1].X - int(diff*v/9), int(20*a/9), 0) 
    p3 = (pt[idx].X + int(diff*v/9), int(20*a/9), 0) 

    p4 = (pt[idx+1].X, 0, 0) 
    
    pt_m = []
    pt_m = bezier(p1, p2, p3, p4)
    for i in range(0,len(pt_m)):
        pt2.append(rs.CreatePoint(pt_m[i].X, pt_m[i].Y + 20, pt_m[i].Z)) 
    
    p1 = (pt[idx+1].X, 0, 0)
    
    p2 = (pt[idx+2].X - int(diff*v/9) , int(20*a/9), 0)
    p3 = (pt[idx+1].X + int(diff*v/9), int(20*a/9), 0)
    p4 = (pt[idx+2].X, 0, 0)
    
    pt_m = []
    pt_m = bezier(p1, p2, p3, p4)
    for i in range(0,len(pt_m)):
        pt2.append(rs.CreatePoint(pt_m[i].X, -pt_m[i].Y + 20, pt_m[i].Z))  
  
# fit a curve through the first row of pattern points
ln = []
ln.append(rs.AddCurve(pt2))

# second pattern row, offset upward by 40 and using the next block of
# arousal / valence values
for j in range(0, int(rx / 2)):
    a = arousal[j + 11] 
    v = valence[j + 11]
    idx = 2*j                  
    p1 = (pt[idx].X, 0, 0)
    p2 = (pt[idx+1].X - int(diff*v/9), int(20*a/9), 0) 
    p3 = (pt[idx].X + int(diff*v/9), int(20*a/9), 0) 

    p4 = (pt[idx+1].X, 0, 0) 
    
    pt_m = []
    pt_m = bezier(p1, p2, p3, p4)
    for i in range(0,len(pt_m)):
        pt3.append(rs.CreatePoint(pt_m[i].X, pt_m[i].Y + 40, pt_m[i].Z)) 
    
    p1 = (pt[idx+1].X, 0, 0)   
    p2 = (pt[idx+2].X - int(diff*v/9) , int(20*a/9), 0)
    p3 = (pt[idx+1].X + int(diff*v/9), int(20*a/9), 0)
    p4 = (pt[idx+2].X, 0, 0)
    
    pt_m = []
    pt_m = bezier(p1, p2, p3, p4)
    for i in range(0,len(pt_m)):
        pt3.append(rs.CreatePoint(pt_m[i].X, -pt_m[i].Y + 40, pt_m[i].Z))   

# fit a curve through the second row of pattern points
ln.append(rs.AddCurve(pt3))


# GhPython component outputs: a = grid points, b = pattern curves
a = pt
b = ln

Step 3: Surface Morph and Solid Union

The surface morph was done with the patterns created in the step above; notice how the curve shape changes with the input arousal and valence values. The surface to map onto was created with the 'RevSrf' (surface of revolution) method we learned in class. After the presentation, as the professor recommended, I used the 'Mesh Union' component to get a solid union of the basket with the patterns on top of it.
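
On the Grasshopper side this step mostly uses standard components (Surface Morph for mapping the pattern onto the basket, Mesh Union for the solid union), so there is little to script. As a minimal rhinoscriptsyntax sketch of the revolved surface, assuming profile_id holds a profile curve for the basket wall (a placeholder name; the actual profile is not shown here):

import rhinoscriptsyntax as rs

# minimal sketch: revolve the profile curve around the vertical axis to get the
# basket surface that the pattern curves are morphed onto
axis = [(0, 0, 0), (0, 0, 1)]   # axis of revolution (world Z)
basket_srf = rs.AddRevSrf(profile_id, axis, 0.0, 360.0)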

Step 4: First Test

This is a test print of small baskets. I will keep posting more results.

Step 5: Attempt to Create Shapes Based on Arousal and Valence (Future Work)

From the paper by Ibáñez and Delgado-Mata, I learned that the same visual modality (sharpness of curvature) can be used to express different dimensions of emotion (arousal and valence) depending on the modalities it is combined with.

(https://www.sciencedirect.com/science/article/pii/...)

I used the concatenation of Bezier curves that they describe to create shapes with different arousal and valence values. The arousal value was used to calculate the height of each curve segment, and the valence value was used to calculate the sharpness of curvature. The curve was then lofted to create a volume. I tried the surface morph onto this shape, but the morph did not come out as I wanted (the patterns cannot be seen in the figure on the right).

I hope to continue this work so that the surface morph onto shapes created from Bezier curves preserves the emotional line patterns. The code excerpt below (from a larger script) builds one Bezier segment per angular division of the shape and rotates it into place.

# excerpt from a larger script: divis, e_angle, radius, scaling_factor,
# arousal_pro, valence_pro, pt2 and self.bezier (the cubic Bezier function
# from Step 2, here a class method) are defined elsewhere
for j in range(0, divis):
    angle = e_angle * j
    new_x = radius * math.cos(angle)
    new_y = radius * math.sin(angle)

    scaling = scaling_factor * radius

    # one Bezier segment spanning one angular division: arousal_pro pushes the
    # control points outward (segment height), valence_prop shifts them along
    # the arc (sharpness of curvature)
    p1 = (scaling*math.cos(e_angle/2), scaling*math.sin(e_angle/2), 0)
    diff = math.sin(e_angle/2) - math.sin(-e_angle/2)
    valence_prop = diff * valence_pro
    p2 = (scaling*math.cos(e_angle/2) + arousal_pro, scaling*(math.sin(-e_angle/2) + valence_prop), 0)
    p3 = (scaling*math.cos(-e_angle/2) + arousal_pro, scaling*(math.sin(e_angle/2) - valence_prop), 0)
    p4 = (scaling*math.cos(-e_angle/2), scaling*math.sin(-e_angle/2), 0)

    pt_m = self.bezier(p1, p2, p3, p4)

    # rotate the segment into place around the origin
    for i in range(0, len(pt_m)):
        xprime = pt_m[i].X * math.cos(-angle) - pt_m[i].Y * math.sin(-angle)
        yprime = pt_m[i].X * math.sin(-angle) + pt_m[i].Y * math.cos(-angle)
        pt2.append(rs.CreatePoint(xprime, yprime, pt_m[i].Z))
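
The lofting step is not included in the excerpt above. As a minimal sketch, assuming the rotated section curves for the different levels of the shape have already been collected into a list of curve ids called section_curves (a placeholder name):

import rhinoscriptsyntax as rs

# minimal sketch: loft the stacked section curves into the final volume
# (section_curves is a hypothetical list of closed curve ids, one per level)
shape = rs.AddLoftSrf(section_curves)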