1. Gradient Descent
1.1 Batch Gradient Descent
eg1: Use gradient descent to determine the parameter t in h(x) = x^2 - t*x - t.
Note that the choice of the iteration factor (step size) really matters QAQ. If the program ends up diverging, check whether the iteration factor was chosen badly. [The -0.01 near the end was found by trial and error QwQ]
```python
def hypo(t, x):    # hypothesis h(x) = x^2 - t*x - t; exact answer: t = 2
    return x * x - t * x - t

def cost(t):       # accumulate (y_i - h_t(x_i)) * x_i over all samples (batch update direction)
    tmp = 0
    for i in range(0, num):
        tmp += (yy[i] - hypo(t, xx[i])) * xx[i]
    return tmp

xx = [-2, -1, 0, 1, 2, 3, 4]    # xx[] and yy[] are the samples
yy = [ 6,  1, -2, -3, -2, 1, 6]
num = 7

eps = 0.00000000001    # precision (stopping threshold)
aa = -0.01             # iteration factor (step size)

tx = 9999
ty = 0
while abs(tx - ty) >= eps:
    tx = ty
    ty = tx + aa * cost(tx)
    print(ty)

print(tx, ty)
```
Iteration results:
```
0.84
1.3272
1.609776
1.77367008
1.8687286464
1.923862614912
1.95584031664896
1.9743873836563968
1.9851446825207102
1.991383915862012
1.9950026711999669
1.9971015492959807
1.9983188985916687
1.9990249611831679
1.9994344774862374
1.9996719969420178
1.9998097582263703
1.9998896597712947
1.999936002667351
1.9999628815470636
1.9999784712972968
1.999987513352432
1.9999927577444105
1.999995799491758
1.9999975637052196
1.9999985869490273
1.9999991804304358
1.9999995246496527
1.9999997242967986
1.9999998400921433
1.9999999072534431
1.999999946206997
1.9999999688000583
1.9999999819040337
1.9999999895043397
1.999999993912517
1.9999999964692599
1.9999999979521708
1.999999998812259
1.9999999993111102
1.9999999996004438
1.9999999997682574
1.9999999998655893
1.9999999999220417
1.9999999999547842
1.9999999999737748
1.9999999999847893
1.9999999999911777
```
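As a sanity check on why the negative iteration factor is the one that converges here (assuming the usual squared-error cost J(t) = 1/2 * Σ (h_t(x_i) - y_i)^2, which the original does not write out), differentiate with respect to t:

```latex
% h_t(x) = x^2 - t x - t  =>  dh_t/dt = -(x + 1)
\frac{dJ}{dt} = \sum_i \bigl(h_t(x_i) - y_i\bigr)\bigl(-(x_i+1)\bigr)
              = \sum_i \bigl(y_i - h_t(x_i)\bigr)(x_i+1) = 56\,(t-2)
% The code's cost(t) multiplies the residual by x_i instead of (x_i + 1):
\mathrm{cost}(t) = \sum_i \bigl(y_i - h_t(x_i)\bigr)\,x_i = 42\,(t-2)
```

Both expressions reduce to positive multiples of (t - 2) because these samples satisfy y_i = x_i^2 - 2*x_i - 2 exactly, so y_i - h_t(x_i) = (t - 2)(x_i + 1). The update ty = tx + aa*cost(tx) therefore only moves t toward 2 when aa < 0, and it converges exactly when |1 + 42*aa| < 1, i.e. aa in (-1/21, 0). With aa = -0.01 the error t - 2 shrinks by a factor of 1 + 42*(-0.01) = 0.58 per step, which matches the printed sequence (2 - 0.84 = 1.16, then 2 - 1.3272 = 0.6728 = 0.58 * 1.16, and so on).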
1.2 Stochastic Gradient Descent
eg2:
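A minimal sketch of what stochastic gradient descent could look like on the same fitting problem as eg1 (illustrative only: the per-sample update reuses aa and the residual form from eg1, but the shuffling, the fixed epoch count, and dropping the eps test are assumptions made here, since a single-sample step can be exactly 0 for x = -1 or x = 0):

```python
import random

def hypo(t, x):    # same hypothesis as eg1: h(x) = x^2 - t*x - t (exact answer: t = 2)
    return x * x - t * x - t

xx = [-2, -1, 0, 1, 2, 3, 4]    # same samples as eg1
yy = [ 6,  1, -2, -3, -2, 1, 6]

aa = -0.01     # iteration factor, same sign convention as eg1
epochs = 200   # fixed number of passes over the data (illustrative choice)

t = 0.0
order = list(range(len(xx)))
for ep in range(epochs):
    random.shuffle(order)    # visit the samples in a random order each pass
    for i in order:
        # stochastic step: update t from this one sample instead of summing over all of them
        t += aa * (yy[i] - hypo(t, xx[i])) * xx[i]

print(t)    # approaches 2, but each run takes a slightly different path
```

The difference from 1.1 is only which residuals enter each step: the batch version sums (yy[i] - hypo(t, xx[i])) * xx[i] over all seven samples before moving t once, while the stochastic version moves t after every single sample, which is cheaper per step but noisier.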