# SciPy Minimize Constrain Around Internal Parameter, Not Input


I am restricted to a cell phone at the minute but I could use some help. Below is my best attempt at a minimal reproducible example, but I'm hoping it's not even necessary.

```
import numpy as np
import scipy as sp

data1 = [1.0, 2.0]
D1_0 = np.asarray(data1)
data2 = [1.5, 2.5]
D2_0 = np.asarray(data2)
specs = [1.25, 2.25]
S = np.asarray(specs)
changes = [[0.1, 0.2], [0.4, 0.5], [0.6, 0.7]]
C = np.asarray(changes)
rows_C, cols_C = C.shape

def get_ssd(D):
    ssd = np.sum((np.subtract(D, S))**2)
    return ssd

def objective(x):
    D1 = np.copy(D1_0)
    D2 = np.copy(D2_0)
    for i in range(cols_C):
        for j in range(rows_C):
            D1[i] = D1[i] + x[j] + C[j][i]
            D2[i] = D2[i] + x[j] + C[j][i]
    minim = get_ssd(D1) + get_ssd(D2)
    return minim

x0 = np.zeros(rows_C)
bnds = sp.optimize.Bounds(-1.0, 1.0)
res = sp.optimize.minimize(objective, x0, method='Powell', bounds=bnds)
print(res.x)
```

If you actually run this code, it executes fine, but it isn't really informative. My question is this: I want to incorporate a condition where `D1` and `D2` must be within some range for all `i`. I know other optimization methods can put expression constraints on `x`, but I need something like that for `D1` and `D2`, not `x`.

My only remotely good idea for achieving this is to put all of this code inside another optimizer (or even just a for-loop) that changes the *bounds* on `x` until the criterion is met. This seems extremely inelegant to me.

I feel like there must be a way to put constraints on variables internal to the objective function rather than on `x`; I just don't know how to do that.

Thank you in advance.