How to improve RBF neural network interpolation ability?

Hi! There are 55 samples, with 2 inputs and 1 output. The predictions at the sample points are excellent, but the interpolated values are very poor. How can I improve the RBF NN for interpolation? Thanks!
clc,clear
% samples
data=[ 550 50.9114 4.9195
550 53.9864 4.8926
550 56.3262 4.8658
550 58.5372 4.5436
550 59.9610 3.3356
525 46.3723 4.5689
525 50.6499 4.5638
525 52.6551 4.5607
525 53.6585 4.5168
525 54.0599 4.4899
525 55.1983 4.3624
525 56.7400 4.1007
525 57.6911 3.1812
500 42.2350 4.1678
500 47.3824 4.1141
500 49.3885 4.0604
500 52.0637 3.9597
500 54.0059 3.7248
500 54.6201 2.9664
475 38.7662 3.7584
475 43.1781 3.7248
475 45.1173 3.6711
475 47.5921 3.5638
475 50.0018 3.3557
475 50.6136 2.7383
450 34.8298 3.3423
450 37.0355 3.3356
450 40.7792 3.2886
450 43.9895 3.1678
450 45.7969 3.0000
450 46.3401 2.4899
400 28.0909 2.6476
400 31.3660 2.6376
400 33.3715 2.6107
400 35.9125 2.5436
400 37.9872 2.3826
400 38.5303 1.8792
350 19.5438 2.1477
350 24.2224 2.1544
350 27.0301 2.1208
350 29.2368 2.0604
350 31.1102 1.9463
350 31.7844 1.5973
300 14.3362 1.8054
300 19.3493 1.7852
300 21.4217 1.7651
300 23.4944 1.7181
300 25.2336 1.6376
300 26.5071 1.4228
250 11.1324 1.5369
250 15.6110 1.5101
250 17.2824 1.4832
250 18.3522 1.4631
250 19.2214 1.4430
250 21.8308 1.2819 ];
p=data(:,[1 3])'; % inputs: columns 1 and 3
t=data(:,2)';     % target: column 2
% input data for interpolation
data1=[ 425 2.3000
425 2.3500
425 2.4000
425 2.4500
425 2.5000
425 2.5500
425 2.6000
425 2.6500
425 2.7000
425 2.7500
425 2.8000
425 2.8500
425 2.9000
425 2.9500
425 3.0000 ];
inputInterp=data1';
net=newrbe(p,t); % exact RBF design; default spread = 1
outTrain=sim(net,p);
% comparison of samples and predictions
plot(outTrain,p(2,:),'o',t,p(2,:),'r*')
hold on
% interpolation
% outInterp=sim(net,inputInterp);
% plot(outInterp,inputInterp(2,:),'o')

Accepted Answer

Greg Heath
Greg Heath on 10 Aug 2013
1. Standardize the inputs to have zero mean and unit variance
2. Design two nets
a. Odd indices 1:2:55
b. Even indices 2:2:54
3. Determine good spread values
a. Vary the spread on the odds and test with the evens
b. Vary the spread on the evens and test with the odds
c. Choose a good spread value for a combined data design
4. Test with the normalized non-design data1
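A minimal MATLAB sketch of steps 1 and 2 (assuming mapstd for the standardization; spread = 1 here is only a placeholder to be tuned in step 3):

```matlab
[pn, ps] = mapstd(p);          % zero-mean, unit-variance inputs (2 x 55)
[tn, ts] = mapstd(t);          % standardize the targets the same way
spread = 1;                    % placeholder; choose via step 3
iodd  = 1:2:55;                % design set a: odd-indexed samples
ieven = 2:2:54;                % design set b: even-indexed samples
neta = newrbe(pn(:,iodd),  tn(iodd),  spread);
netb = newrbe(pn(:,ieven), tn(ieven), spread);
```

New inputs (e.g. data1) must be transformed with the same settings, `mapstd('apply', x, ps)`, and outputs inverted with `mapstd('reverse', y, ts)`.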
Thank you for formally accepting my answer
Greg
  1 Comment
huan tu
huan tu on 11 Aug 2013
Edited: huan tu on 11 Aug 2013
  1. I have normalized the training data and the interpolation inputs, but the result is still not as desired.
  2. The variance in the Gaussian function is also called the smoothing factor; does it mean the same as 'spread' in net=newrbe(P,T,spread)?
  3. I don't understand how to design two nets to predict. Could you show me the code if possible? Thank you very much!
clc,clear
% samples
data=[ 550 50.9114 4.9195
550 53.9864 4.8926
550 56.3262 4.8658
550 58.5372 4.5436
550 59.9610 3.3356
500 42.2350 4.1678
500 47.3824 4.1141
500 49.3885 4.0604
500 52.0637 3.9597
500 54.0059 3.7248
500 54.6201 2.9664
475 38.7662 3.7584
475 43.1781 3.7248
475 45.1173 3.6711
475 47.5921 3.5638
475 50.0018 3.3557
475 50.6136 2.7383
450 34.8298 3.3423
450 37.0355 3.3356
450 40.7792 3.2886
450 43.9895 3.1678
450 45.7969 3.0000
450 46.3401 2.4899
400 28.0909 2.6476
400 31.3660 2.6376
400 33.3715 2.6107
400 35.9125 2.5436
400 37.9872 2.3826
400 38.5303 1.8792
350 19.5438 2.1477
350 24.2224 2.1544
350 27.0301 2.1208
350 29.2368 2.0604
350 31.1102 1.9463
350 31.7844 1.5973
300 14.3362 1.8054
300 19.3493 1.7852
300 21.4217 1.7651
300 23.4944 1.7181
300 25.2336 1.6376
300 26.5071 1.4228 ];
data=[data(:,1) data(:,3) data(:,2)]; % swap columns 2 and 3 (input and target exchange roles)
p=data(:,[1 3])';
t=data(:,2)';
% interpolation validation
data1= [525.0000 46.3723
525.0000 50.6499
525.0000 52.6551
525.0000 53.6585
525.0000 54.0599
525.0000 55.1983
525.0000 56.7400
525.0000 57.6911];
dataInterp=data1';
[normInput,ps]=mapminmax(p);
[normTarget,ts]=mapminmax(t);
normInterp=mapminmax('apply',dataInterp,ps);
net=newrbe(normInput,normTarget);
normTrainOutput=sim(net,normInput);
normInterpOutput=sim(net,normInterp);
TrainOutput=mapminmax('reverse',normTrainOutput,ts);
InterpOutput=mapminmax('reverse',normInterpOutput,ts);
figure(1)
% plot predictions at samples
plot(p(2,:),TrainOutput,'o',...
p(2,:),t,'r*')
hold on
% plot interpolation
plot(dataInterp(2,:),InterpOutput,'o')
refInterp=[525.0000 4.5638 46.3723
525.0000 4.5705 50.6499
525.0000 4.5638 52.6551
525.0000 4.5168 53.6585
525.0000 4.4899 54.0599
525.0000 4.3624 55.1983
525.0000 4.1007 56.7400
525.0000 3.1812 57.6911]';
plot(refInterp(3,:),refInterp(2,:),'rd','MarkerFaceColor','r')


More Answers (1)

Greg Heath
Greg Heath on 11 Aug 2013
Edited: Greg Heath on 11 Aug 2013
%1. I have normalized the training data and the interpolation inputs, but the result is not as desired.
That is no help. Show your new code.
% 2. The variance in the Gaussian function is also called the smoothing factor; does it mean the same as 'spread' in net=newrbe(P,T,spread)?
No. Unfortunately, the help documentation on newrbe stinks. Try the doc command
doc newrbe
Maybe the help and doc pages on the other RBF functions will also be useful. See also:
newgrnn | newpnn | newrb
%3. I don't understand how to design two nets to predict. Could you show me the codes if possible? Thank you very much!
Very simple:
1. Divide the data in half (odd/even indices).
2. Loop over spread values: design a net with the evens, test with the odds.
3. Choose the best spread value.
4. Repeat with the evens and odds reversed in the design and test roles.
5. Combine those two "best" spread values.
6. Use the combined value to design a net with all of the data.
7. Test all 3 nets on data1.
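One half of that search might be sketched in MATLAB like this (a rough illustration, not Greg's actual code; the candidate grid and the MSE error measure are assumptions, and pn, tn are assumed to be the standardized inputs and targets):

```matlab
iodd  = 1:2:55;  ieven = 2:2:54;       % odd/even split of the 55 samples
spreadvals = 2.^(-3:3);                % candidate spread values (assumed grid)
mse = zeros(size(spreadvals));
for k = 1:numel(spreadvals)
    net    = newrbe(pn(:,ieven), tn(ieven), spreadvals(k)); % design on evens
    err    = tn(iodd) - sim(net, pn(:,iodd));               % test on odds
    mse(k) = mean(err.^2);
end
[~,kbest]   = min(mse);
bestspread1 = spreadvals(kbest);
% Repeat with iodd/ieven swapped to get bestspread2, combine them (e.g. a
% geometric mean), then design the final net on all of the data:
% spread = sqrt(bestspread1*bestspread2);
% net    = newrbe(pn, tn, spread);
```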
Thank you for formally accepting my answer
Greg
  2 Comments
huan tu
huan tu on 11 Aug 2013
Edited: huan tu on 11 Aug 2013
I found the following description by searching 'rbf spread' in the MATLAB Help browser: All of the radial basis functions also have an associated width parameter 'sigma', which is related to the spread of the function around its center. Selecting the box in the model setup provides a default setting for the width. The default width is the average over the centers of the distance of each center to its nearest neighbor. This is a heuristic given in Hassoun (see References) for Gaussians, but it is only a rough guide that provides a starting point for the width selection algorithm.
Does it mean 'spread' is a starting value for training the 'variance' in the RBF learning algorithm? Or something else?
Greg Heath
Greg Heath on 12 Aug 2013
No. Spread stays constant within a design. For this (and most trial-and-error searches) I vary it over a geometric grid [ ... 0.25, 0.5, 1, 2, 4, ... ] of multipliers of the default value given in the documentation.
For each value I create 10 designs by using different random initial weights.
After debugging, the code should run in less than a few minutes.
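The default width from the nearest-neighbor heuristic quoted above, and the geometric grid of multipliers around it, could be computed like this (a sketch; it assumes pn holds the standardized inputs, every sample is a center as in newrbe, and the Statistics Toolbox is available for pdist):

```matlab
D = squareform(pdist(pn'));             % pairwise distances between centers
D(1:size(D,1)+1:end) = Inf;             % mask the zero self-distances
defaultspread = mean(min(D));           % average nearest-neighbor distance
spreadvals = defaultspread * 2.^(-2:2); % multipliers 0.25, 0.5, 1, 2, 4
```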
You can probably find something useful by searching on
greg spread
HTH
Greg

