My assignment was to write a function firstDeriv() that uses the two-point formula [f(x+h) - f(x)]/h and outputs an array of sampled values of the derivative of the input function. With a spacing of 0.1 between points, the array has (upper_bound - lower_bound)/0.1 + 1 sample points.
I have the answer, I just don't understand it, so could someone explain what is going on in the for loop?
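As a sanity check on that point count, here is a small Python snippet (the bounds 0.0 and 1.0 are just illustrative values, not from the assignment):

```python
lower_bound, upper_bound, h = 0.0, 1.0, 0.1
# round() guards against floating-point error in (1.0 - 0.0)/0.1
N = round((upper_bound - lower_bound) / h) + 1
print(N)  # 11 sample points: 0.0, 0.1, ..., 1.0 inclusive
```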
This is my code. I call the function from a separate script file; the function itself is in its own file:
function [ deriv ] = firstDeriv( input, lower_bound, upward_bound )
h = 0.1;                                   % spacing between sample points
N = ((upward_bound - lower_bound)/h) + 1;  % total number of sample points
for i = 1:N-1                              % MATLAB indices always start from 1, not 0
    deriv(i) = ( input(lower_bound + i*h) - input(lower_bound + (i-1)*h) ) / h;
end
My question is: why does the deriv(i) equation use f(x) = input(lower_bound + (i-1)*h)? Why is it not just input(x+h)?
What is the thought process behind that equation that I'm not comprehending?
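For reference, here is how I read the loop, rewritten as a hypothetical Python sketch (the names first_deriv and f are mine; the MATLAB code calls the function handle `input`). It shows the mapping from loop index i to sample point x:

```python
def first_deriv(f, lower_bound, upper_bound, h=0.1):
    # round() guards against floating-point error in the division
    n = round((upper_bound - lower_bound) / h) + 1
    deriv = []
    for i in range(1, n):                # mirrors MATLAB's i = 1:N-1
        x = lower_bound + (i - 1) * h    # index i maps to sample point x
        deriv.append((f(x + h) - f(x)) / h)
    return deriv
```

So input(lower_bound + (i-1)*h) is f(x) at the i-th sample point, and input(lower_bound + i*h) is f(x+h), the next sample point over.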