--- Log opened Sat Feb 18 00:00:19 2012 -!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has joined #shogun 00:00 so is there any way to build a strong classifier with shogun where several "weak" ones are put together? 00:07 I am looking for it to work on an example application on face detection 00:08 I have made tests using fancy features but no good results using svm with a polynomial kernel 00:09 n4nd0, we don't have boosting if you mean that 00:28 try multiboost for that 00:28 sonney2k, I was looking for an alternative solution to boosting 00:29 I want to stick to shogun 00:30 sonney2k, about boosting, I have seen in the gsoc ideas page that one of the suggested projects last year was to merge shogun and multiboost 00:33 no longer there ... 00:33 too ambitious 00:33 aham 00:33 but would that still be of interest for shogun? 00:34 sure 00:35 alright ... sleep time! 00:36 cu 00:36 sonney2k, I have worked with adaboost in particular before 00:36 so I could start taking a look to multiboost and try to see how could that be ported to shogun 00:36 ok 00:36 good night 00:36 -!- n4nd0 [~n4nd0@s83-179-44-135.cust.tele2.se] has quit [Quit: Leaving] 00:51 -!- CIA-64 [~CIA@cia.atheme.org] has joined #shogun 03:00 -!- Netsplit *.net <-> *.split quits: CIA-18 03:03 -!- blackburn [~qdrgsm@109.226.88.39] has joined #shogun 09:08 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] 10:09 shogun: Sergey Lisitsyn master * rdd64345 / (5 files in 4 dirs): Fixed a couple of warnings - http://git.io/EduvuQ 10:14 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun 10:44 wiking: have you tried 1/2-homogay map? 11:01 vedaldi surprisingly reported improvement with this 11:01 what's 1/2-hkm?
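[Editor's note: the boosting idea n4nd0 describes above ("several weak classifiers put together") can be sketched in a few lines of plain Python. This is an illustrative AdaBoost with 1-D threshold stumps as weak learners; all function names here are made up for the sketch and are not shogun's or multiboost's API.]

```python
# Minimal AdaBoost sketch with one-dimensional decision stumps.
# Illustrative only -- not shogun/multiboost code.
import math

def train_stump(xs, ys, ws):
    """Pick the threshold/polarity pair with the lowest weighted error."""
    best = None
    for thr in sorted(set(xs)):
        for pol in (1, -1):
            preds = [pol if x < thr else -pol for x in xs]
            err = sum(w for w, y, p in zip(ws, ys, preds) if y != p)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(xs, ys, rounds=5):
    n = len(xs)
    ws = [1.0 / n] * n          # start with uniform sample weights
    ensemble = []
    for _ in range(rounds):
        err, thr, pol = train_stump(xs, ys, ws)
        err = max(err, 1e-10)   # avoid log/div-by-zero on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, pol))
        # Re-weight: misclassified points gain weight for the next round.
        ws = [w * math.exp(-alpha * y * (pol if x < thr else -pol))
              for w, x, y in zip(ws, xs, ys)]
        z = sum(ws)
        ws = [w / z for w in ws]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (p if x < t else -p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

Each round fits a stump on re-weighted data and the final classifier is the alpha-weighted vote, which is the scheme the log alludes to.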
11:03 wiking: order of homogaynit 11:05 y 11:05 I mean they use 1/2 as order 11:05 and it smooths things 11:05 but I got worse results 11:06 but how can he do 1/2 order 11:09 when it's an int in their implementation 11:09 :D 11:09 hmm 11:10 wait 11:10 not order 11:10 gamma 11:10 my bad 11:11 k(cx,cy) = c^\gamma k(x,y) 11:11 wiking: using 0.5 for JS improved for 1% 11:20 on my data 11:20 hog 11:20 aha 11:21 compared to original JS? 11:21 or compared to 1.0 gamma ;P 11:21 i'm gonna introduce a new kernel today/tomorrow 11:21 and let's see if i can make that HKM as well 11:21 wiking: JS with gamma = 1.0 and JS with gamma = 0.5 11:22 btw it is pretty dirty hack 11:22 for me 11:22 wiking: jenson-shannon-renyi-mozart? 11:22 which? :) 11:22 ahahah yepp 11:23 :) 11:23 wiking: hmm what is the natural gamma for JS btw? 11:23 none 11:24 i mean this is like basically a tailor series 11:24 so it's just approximation 11:24 so i guess gamma -> inf would get you to original JS 11:24 hmm 11:25 Smaller value of @f$\gamma @f$ enhance the kernel non-linearity and 11:26 are sometimes beneficial in applications 11:26 I guess your suggestion about gamma -> inf wasn't right 11:27 yep 11:27 gamma -> inf makes it LINEAR :) 11:27 and is says that 11:27 or something like that 11:27 oh I don't like it 11:27 makes more params :( 11:27 if gamma = 1 then u should obtain the standard kernels 11:27 the order ->inf would do that what i'm saying 11:28 i mean yeah as it says here 11:28 I see 11:28 The homogeneous kernel map 11:28 approximation is based on periodicizing the kernel 11:28 so basically you could see this 11:28 or at least i would see this as sampling the signal 11:29 and then u have there the Nyquist-Shannon sampling theorem 11:29 and basically period is the sampling rate 11:29 imho 11:29 pretty complex huh 11:29 but yeah I catch it 11:30 wiking: where are you doing your phd btw? 
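[Editor's note: the homogeneity property written above, k(cx, cy) = c^\gamma k(x, y), can be checked numerically. The two classic additive kernels below are both 1-homogeneous (gamma = 1); the gamma = 0.5 variants the two are discussing come from Vedaldi & Zisserman's generalized homogeneous kernel map and are not reproduced here.]

```python
# Numerical check of k(cx, cy) = c^gamma * k(x, y) for two additive
# kernels with gamma = 1 (histogram intersection and chi-square).

def k_intersection(x, y):
    return sum(min(a, b) for a, b in zip(x, y))

def k_chi2(x, y):
    return sum(2 * a * b / (a + b) for a, b in zip(x, y) if a + b > 0)

x = [0.2, 0.5, 0.3]
y = [0.1, 0.6, 0.3]
c = 4.0
for k in (k_intersection, k_chi2):
    # scaling both arguments by c scales the kernel value by c^1
    assert abs(k([c * a for a in x], [c * b for b in y]) - c * k(x, y)) < 1e-12
```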
11:42 gent, belgium 11:46 yesterday i've found a fucking paper that partially implements that idea i had lately 11:47 it'll be published next month 11:47 fuckers... from stanford.. 11:47 hah that fucked up feeling 11:47 so you are a big traveller :) 11:48 wiking: will you have enough time for doing latent svms this summer? 11:49 yep 11:49 nice 11:49 i'll be in iceland for a month in july 11:49 doing nothing 11:49 iceland?? 11:49 yeah i was living quite around this world... 11:49 I wish I did the same haha 11:50 nevertoolate 11:50 notenoughmoney 11:50 :D 11:50 u think i had ? :P 11:50 anyhow 11:50 there's this thingy 11:50 will you live in a box? 11:50 :D 11:50 http://www.pascal-network.org/?q=node/19 11:51 i think it's a good idea, and shogun could apply for it 11:51 the only problem is that the program is running out and they don't know if there's going to be a last call for participation or not... 11:51 they going to decide it on the end of march 11:51 and then with this you could travel 11:52 ;) 11:52 ok i'm off now 11:52 see you 11:52 nice program 11:52 i'll be back sometime at night 11:52 but no idea how to apply it for shogun 11:52 ok 11:52 it's easy 11:52 I mean we can hardly find 4-8 guye 11:53 guys 11:53 nono 11:53 easy 11:53 you are one 11:53 :> 11:53 anyhow i'll let you know if they are going to be a last call for participation 11:53 ok 11:53 e.g. Soeren can't participate in this program 11:53 ttyl 11:53 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking] 11:54 he is pretty busy with his job 11:54 cu 11:54 :) 11:54 sonne|work: http://www.pascal-network.org/?q=node/19 is it any applicable to shogun? 11:59 hm I forgot it is saturday 13:46 :D 13:46 sonney2k: ^ 13:46 -!- Ram108 [~amma@14.96.26.93] has joined #shogun 14:33 blackburn: there? 14:34 Ram108: yes 14:34 hi i think i figured out what went wrong......... 
14:35 well the weight vectors were initialised to large values and the error equation yielded very small values to change the weight vectors appropriately 14:36 thanks for the help.......... 14:36 hah, did I help you anyhow? :) 14:39 lol 14:39 erm do u mind i ask u hw long were u in this field of work? 14:39 i mean machine learning 14:40 hmm 14:41 i am not able to grasp what all topics does this field comprise off 14:41 more than year 14:41 may be 1.5 14:41 oh hmmm thanks :) 14:41 well NN, SVM, fuzzy logic, Genetic algorithms 14:42 is that about it? 14:42 i mean do all the learning algorithms fall under one of these? 14:42 NN + SVM mainly 14:42 genetic stuff stands apart usually 14:43 more generally, evolutionary computing 14:43 fuzzy logic as well 14:43 oh hmmm okay......... 14:43 but there are a lot of intersections everywhere 14:43 ah........ 14:43 shogun contains svms mainly 14:44 i can see that lol 14:44 thats perhaps why am not able to understand all the liblinear lib...... etc etc 14:44 i ll read up on that...... 14:45 by the way could u enlighten me more on what Mr sonney is doing? 14:45 I don't know, he does some research at tomtom 14:46 earlier he was related to bioinformatics 14:47 oh hmmm i c....... and he devotes the rest of his spare time building shogun? 14:47 yes 14:49 ok :) last one....... hw old is shogun? 
14:49 since 1999 14:50 thanks :) 14:50 -!- Ram108 [~amma@14.96.26.93] has quit [Ping timeout: 240 seconds] 14:55 -!- Ram108 [~amma@14.96.172.24] has joined #shogun 15:12 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun 15:33 -!- Ram108 [~amma@14.96.172.24] has quit [Ping timeout: 255 seconds] 16:28 -!- naywhaya1e [~ryan@spoon.lugatgt.org] has joined #shogun 18:35 -!- naywhayare [~ryan@spoon.lugatgt.org] has quit [Ping timeout: 240 seconds] 18:37 -!- blackburn [~qdrgsm@109.226.88.39] has quit [Ping timeout: 240 seconds] 18:37 -!- sonne|work [~sonnenbu@194.78.35.195] has quit [Ping timeout: 240 seconds] 18:37 -!- sonne|work [~sonnenbu@194.78.35.195] has joined #shogun 18:37 -!- blackburn [~qdrgsm@109.226.88.39] has joined #shogun 18:37 -!- Ram108 [~amma@14.99.168.133] has joined #shogun 18:43 n4nd0: hey 19:19 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Ping timeout: 252 seconds] 19:19 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun 20:03 -!- Ram108 [~amma@14.99.168.133] has quit [Quit: Ex-Chat] 20:06 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun 21:16 sonney2k: hi! so about what we talked a bit yesterday of adding boosting to shogun 21:28 sonney2k: I would like to first start with adaboost, what do you think it is a good start to do it? 21:29 blackburn: yo 21:39 wiking: hey 21:39 wazza 21:39 are we getting multiclass linear sum soon? 21:40 *svm 21:40 heh 21:40 yes, I'm working on it 21:40 although not very fast 21:40 ok cool 21:40 that'd be great to see the speedup 21:40 yeah liblinear with cramer singer should be fast and accurate 21:41 blackburn: do you know sth about multiboost and the gsoc project that was suggested last year porting it to shogun? 21:42 n4nd0: no, unfortunately no 21:42 blackburn: I started thinking of it since I don't see other way of making the face detector work :( 21:43 do you need a face detector? 
21:43 blackburn: I have tried with some fancy haar features for a single svm but no luck 21:43 did you get better results with boosting? 21:43 blackburn: well, not really ... it was more with the idea of making this example we talked about 21:43 blackburn: yes, with the implementation from scratch I did in matlab with boosting of weak classifiers it worked 21:44 what are the numbers? 21:44 if you really need boosting you may work on it, else it is okay to show bad results 21:45 bad results are results as well right? 21:45 haha yes they are 21:45 in any case it could be nice try to get into shogun some of the algorithms in multiboost 21:47 blackburn: or do you think it might be too optimistic to do let say in a couple of non-intense weeks? 21:48 n4nd0: Soeren said it is pretty ambitious 21:49 did he? 21:49 he referred to the gsoc project that was proposed last year, but yes he said that 21:49 I had in mind anyway to limit to adaboost firstly 21:50 these are the parameters of the classifiers I get using a single svm in the CBCL database 21:50 >>>> accuracy   = 0.932669322709 21:50 >>>> precision  = 0.985507246377 21:50 >>>> error rate = 0.0673306772908 21:50 >>>> recall     = 0.28813559322 21:50 >>>> ROC area   = 0.907626093794 21:50 recall is too low 21:51 what are results with your boosting? 21:52 I didn't try it with this database 21:52 ehmm how then can you compare :) 21:53 because we could use that classifier "for real", not in real time though 21:54 but it worked nicely with images 21:54 and, even if I have not tried the svm with real images, with a recall of about 0.288 I don't think it will work fine 21:54 I just wonder if recall is calculated correctly 21:57 hmm seems so 21:59 n4nd0: did you try kernels? 21:59 blackburn: I have used a polynomial one 22:00 gaussian? sigmoid? 22:00 blackburn: I have tried it with degree 2 and 3 22:00 no, I have not tried other ones 22:00 why? 
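[Editor's note: the metrics n4nd0 prints above follow the standard confusion-matrix definitions. The small helper below (hypothetical, not shogun code) makes explicit why accuracy and precision can look good while recall stays near 0.29: when the non-face class dominates the test set, a classifier that misses most faces still scores well on accuracy.]

```python
# Standard classification metrics from raw confusion-matrix counts:
# tp = true positives, fp = false positives, fn = false negatives,
# tn = true negatives.
def metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,      # all correct / all examples
        "precision": tp / (tp + fp),        # how trustworthy a "face" call is
        "recall": tp / (tp + fn),           # fraction of true faces found
        "error rate": (fp + fn) / total,    # 1 - accuracy
    }
```

With made-up counts such as `metrics(50, 10, 100, 840)` the recall is 1/3 while accuracy is 0.89, mirroring the imbalance in the numbers quoted above.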
22:00 in the page where I got the CBCL face data I read they had used those kernels and not others 22:01 heh 22:01 I assumed those were the best for this application 22:01 but it might be that they improve 22:02 let me try 22:02 well gaussian works well usually 22:02 n4nd0: btw did you change C? 22:02 normalization? 22:03 these things are considerable steps 22:03 currently I am using a value of 1.0 for C 22:04 I tried changing it to bigger ones but it went worse 22:04 do you mean normalization for the training and test data? 22:04 yes, both should be normalized 22:04 so the images get zero mean and std 1? 22:04 I mean vectors of your features should have L2 norm = 1 for linear kernel 22:05 mmm 22:05 no 22:05 could be better 22:05 the features are not normalized :( 22:05 preprocessor = NormOne() 22:06 preprocessor.apply_to_feature_matrix(train_features) 22:06 preprocessor.apply_to_feature_matrix(test_features) 22:06 something like that 22:06 ok, I will take it a look 22:06 even if the features are the pixel of the images and the images are already normalized, do you think it will make a difference? 22:07 they are normalized as I told you, zero mean and standard deviation equal to one 22:07 I don't know 22:08 better try 22:08 :) 22:08 ok 22:08 blackburn: how is it a good way to choose the parameters for the kernels, such as the width for the Gaussian? 22:09 cross-validation? 22:09 yeah 22:09 blackburn: hey man btw do you know viktor pelevin? he's one of my favorite contemporary writers... and i've just seen that they've made a movie out of one of his books http://www.imdb.com/title/tt0459748/ 22:09 wiking: yes, I've seen this movie :) 22:10 is it good? 22:10 and have read a book as well 22:10 not bad :) 22:10 unfortunately haven't read that book yet from him.. but all the latest ones... 
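[Editor's note: the `NormOne()` preprocessor snippet blackburn pastes above rescales each feature vector to unit L2 norm. In plain Python the effect per vector looks like the sketch below (a sketch of the behavior, not shogun's implementation).]

```python
# Rescale a vector to unit L2 norm, i.e. what applying a norm-one
# preprocessor does to each feature vector independently.
import math

def norm_one(vec):
    n = math.sqrt(sum(v * v for v in vec))
    # leave an all-zero vector unchanged rather than divide by zero
    return [v / n for v in vec] if n > 0 else list(vec)
```

For example `norm_one([3.0, 4.0])` gives a vector of length one in the same direction, `[0.6, 0.8]`.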
22:10 starting from the yellow arrow 22:10 empire v 22:10 generation P is the only book I've read :) 22:11 from pelevin I mean 22:11 not generally :D 22:11 and sacred book of the werewolf... and i don't know what was the english title for one playing in the big russian revolution time 22:11 hehehe 22:11 anyhow he's really cool 22:11 i'm just getting this movie 22:11 i'm just a bit scared of the subtitles 22:12 is it translated? 22:12 ah 22:12 hey you should understand russian a little :) 22:12 heheh i do a little 22:12 wiking: there are a lot of really good artists 22:13 it's funny when i was living in australia there were a lot of russians around me... i could understand 60-70% of what they were saying... but when i've tried serbian with them they couldn't understand a word... 22:13 epifantsev as tatarsky, efremov as azadovsky.. 22:13 have you been living in australia?? 22:13 yeah 22:13 for almost 2 years 22:14 damn is there any island where you weren't? 22:14 never been to latin america nor africa and neither in asia 22:14 :D 22:14 just been at bangkok airport 3 times :P 22:14 i wanted to do the transiberian 22:14 but that has been postponed... but hoping to do it once soon 22:15 where are u living in russia atm? 22:15 you are welcome at samara/togliatti 22:15 :) 22:15 aaah 22:15 it's by the volga river right? 22:15 yeah 22:15 cool 22:15 how far is the star city from you? :) 22:16 the place 'lada' cars are being made 22:16 star? 22:16 i mean cosmos city 22:16 or what 22:16 in kazakstan 22:16 ah baikonur? 
22:16 aaah yeah 22:16 pretty far 22:16 let me check 22:16 fucking hell i'm tired not remembering the name 22:16 oooh fuck 22:17 it's that faaar 22:17 i mean baikonur 22:17 ~1200 22:17 heheheh so the name for the car lada samara is coming from the name of the city ?:))) 22:17 hahah didn't know that one :> 22:18 yes 22:18 VAZ is in togliatti 22:18 that's a funny car :> 22:18 but cars was named after city, right 22:18 sometimes it is called zubilo 22:18 http://upload.wikimedia.org/wikipedia/commons/thumb/f/fa/Lada_Samara.JPG/220px-Lada_Samara.JPG 22:19 http://upload.wikimedia.org/wikipedia/commons/thumb/5/50/ColdChisels.jpg/220px-ColdChisels.jpg 22:19 something similar right? 22:19 yep yep those lada samaras 22:19 i remember seeing them a lot 22:19 where? 22:19 well all around eastern europe 22:19 but since the communism stopped being... 22:20 btw now they do cars like that 22:20 http://wroom.ru/i/cars2/lada_priora_1.jpg 22:20 woah 22:20 high tech shit 22:20 not as ugly 22:20 loved russian cars 22:20 as it was 22:20 they were reliable 22:20 and simple so easy to fix 22:20 not really but it is easy to fix it 22:20 yeah i mean that's why 22:20 with kuvalda 22:21 even if there was something even somebody with a little knowledge could fix it 22:21 uaz 22:21 or what was that truck 22:21 uaz being made a little norther 22:21 in ulyanovsk 22:22 ahhaha they were like survivor machines 22:22 :) 22:22 the uaz bus 22:22 http://upload.wikimedia.org/wikipedia/commons/a/a1/UAZ-Bus.jpg 22:22 bus? I'm not sure 22:22 this oen!!! 22:22 ah 22:22 buhanka 22:22 do u know this one? 
:) 22:22 this shit is fucking crazy 22:22 that thing was going in any shit :))) 22:22 it is able to climb 75 degree mountain 22:23 :D 22:23 yep 22:23 amazing shit 22:23 I heard one story 22:23 a couple of guys went to hunt a little 22:23 here 22:23 they had land cruiser, etc 22:23 so they sticked in snow 22:23 and there came man on this uaz 22:24 pulled their cars easily and told them to use better cars next time :D 22:24 nothing can mess with an UAZ!!! 22:24 yeah i totally believe 22:24 hah yeah crazy car 22:24 that car is some amazing piece of mechanism 22:25 there are a lot of them still here 22:25 but yeah i really like russian made stuff... i mean they are just bruteforce 22:25 they were never nice and shinny stuff 22:25 but for the purpose it was great :) 22:25 rather soviet 22:25 hahahaha 22:26 yeah 22:26 i just remember the soviet era stuff 22:26 dunno how is it now with russia :> 22:26 how's IT doing it there? 22:26 i mean it should be great in one way 22:26 IT? 22:26 inf. tech. stuff? 22:26 as there's amazing coders coming from russia 22:26 yeah inf.tech 22:26 well it is ok 22:26 for example we have a lot of outsource in samara 22:27 netcracker, epam ,mercury 22:27 I work at netcracker btw 22:27 heheheh i always wondered why people talk about outsourcing to india, when russian coders are way better imho 22:27 yeah I think so :) 22:28 man when i was working with indians in nokia.... 22:28 i really cannot explain 22:28 I can imagine haha 22:28 if i could have i would release all their codes 22:28 in one book? 22:28 :D 22:28 it's like 2 functions they've used for EVERYTHING 22:28 oracle_func_1 22:29 oracle_func_2 22:29 exactly 22:29 i remember cleaning up their shit 22:29 this function oracles everything 22:29 hah 22:29 and the best is 22:29 using 2 files 22:29 .h and .cpp 22:29 not more 22:29 2 22:29 damn how old are you? 22:29 and that was for a browser interface 22:29 you were at every country I know 22:29 :D 22:29 worked in every company? 
:D 22:29 ahahah too old man too old 22:30 30 22:30 ohh it is being clearer now 22:30 hehehe yeah makes sense right? :P 22:30 hah yeah 22:30 but yeah it is fucking crazy cleaning up a 10k+ lines file 22:30 I have 2/3 of your hah 22:30 hahahahahahahha 22:30 little green grasshopper then 22:31 yeah 22:31 blackburn: wow the training with the Gaussian kernel is never ending :-O 22:31 n4nd0: should be a little slower 22:31 hmm.. 22:31 wiking: artist playing tatarsky in generation P had some crazy roles before 22:34 for example 22:34 http://www.youtube.com/watch?feature=player_detailpage&v=jhzpe7QxlGw 22:34 hahahaha 22:35 lets test your russian 22:35 did you get title? :) 22:35 what is головы 22:36 ? 22:36 head 22:36 ahhaha 22:36 TIDE or cutting off the head, something like that I guess 22:36 yeah 22:36 i didn't get ???? 22:36 just tide 22:36 cutting was ok 22:36 i got that one 22:37 and of course ili 22:37 :P 22:37 aaah golovi 22:37 yeah 22:37 it's glava on serbian 22:37 i should have got it 22:37 3:10 hah 22:37 :) 22:37 hmm I just wonder what do you think about kosovo 22:38 honestly 22:38 3:46 is ok as well 22:38 no opinion 22:38 i mean it's a big mess 22:38 it's just bad that people cannot agree on it in a normal manner 22:38 but that's quite usual in balkans :P 22:38 as for me it was a great shame for my country to not help serbians there 22:38 i mean that they cannot communicate in a normal manner 22:39 but may be I'm wrong 22:39 well 22:39 it's really not nice for the minorities there (kosovar people) 22:39 because of the past stories... 
oppression by the serbs etc 22:39 so i completely understand that part 22:39 it's just funny when people from around the world who have no idea about anything 22:40 try to fix it 22:40 :) 22:40 but when you look at their 'own mess' it's even worse in a way 22:40 so a lot of my friends (me as well) think 'kosovo je serbia' :) 22:40 like the guy who sketched up the 'solution' for kosovo is from finland 22:40 funny thing I don't think chechnya should be included 22:41 into russia 22:41 and for instance the situation with russians in finland (on the border) 22:41 it's like wtf 22:41 i mean finnish people just amazingly hating russians... :( 22:41 hehehe yeah you have some troubles of your own as well 22:42 for what? 22:42 well i don't know 22:42 it's just something from the past 22:42 do you know how they 'solved' chechnya problem? 22:42 njet molotoff hah 22:42 hahahahahha 22:42 i mean their hate is irrational (finnish) 22:43 they hate russians because they tried to invade finland couple of times 22:43 hah yeah 22:43 but they have almost no real problems with the swedes 22:43 who kind of like ruled them for 100+ years 22:43 so it's amazing how unbalanced that shit is 22:44 I know no one hating finnish :) 22:44 but it's all the same with those countries there in the baltics 22:44 estonia/latvia/lithuania same 22:44 yeah 22:44 they hate their soviet legacy 22:44 yeah 22:44 but it's part of their culture 22:44 and identity 22:44 so funny to hate something that is part of u 22:45 :> 22:45 btw currently we don't like soviet state of mind as well 22:45 ahhahahah 22:45 well 22:45 i don't know which was better 22:45 i mean don't get me wrong 22:45 i don't know that much of current state of russia 22:45 but the thing with putin and yelcin 22:45 so the situation with putin clearly describes what I mean 22:46 people here want vozhd 22:46 who will rule them 22:46 yeah i kind of like sensed that one... 
that in russia some people just want a big leader 22:46 30% do 22:46 something like stalin 22:46 + some falsification 22:46 or breznyev :P 22:46 and here we go, putin again 22:47 ehehheheh 22:47 but that's amazing 22:47 i mean the whole thing around putin 22:47 the oligarch 22:47 it's like a big fucking maffia 22:47 exactly it is 22:47 especially with gazprom 22:47 some day he will be judged 22:47 particularly yukos as well 22:48 i mean on the other hand if u look what was happening with yelcin... 22:48 khodorkovsky 22:48 yeltsin was worse for sure 22:48 i mean that was amazing how the things gone really bad with yeltsin 22:48 everything started to get wasted... 22:48 but there was a big bonus for russia 22:49 cost of oil 22:49 heheh yeah 22:49 it impacted everything 22:49 if there was 30\$ for barrel - there would be no way for such 'great putin' 22:49 would be only wastelands here :) 22:49 :P 22:50 wiking: do you know how they trying to calm down people here? 22:50 there were protests after elections 22:51 a lot of 22:51 yeah 22:51 read about those 22:51 they just grow hate to US 22:51 :D 22:51 they say US wants to do revolution here 22:51 no way! 22:51 hahah 22:51 ahahahah 22:51 yes, they say we don't want another lebanon here 22:51 or siria 22:51 or egypt 22:51 it works for not-too-smart-people 22:52 but more educated city people usually just laughs at it 22:52 sorry but i gotta run again... would love to continue this conversation some time soon 22:52 :( 22:52 aha okay :) 22:53 but yeah fuck putin :P 22:53 was great to talk to you 22:53 hahah 22:53 yeah you too! 22:53 laterz! 22:53 see you 22:53 cya 22:53 blackburn: should it be better if I increase the cache-size parameter? 22:57 right now is to 40, and it has been training the svm for a long long while 22:57 n4nd0: yes, should be 22:58 still?? 
22:58 that's crazy :) 22:58 yeah I know :-P 22:58 but it is also because the size of the images changed when I tried with some fancy haar features 22:59 I am going to come back to the 19x19 and increase the size cache 22:59 which one is a good value for it? 22:59 it is size of cache in mb 22:59 you may use any that fits into your memory 22:59 and for the width in the Gaussian? 23:02 an approximate value that should go good? 23:02 is there any heuristic or thumb rule to use? 23:02 well not really 23:03 it should be very small and very large 23:03 :D 23:03 shouldn't* 23:03 I don't know any good 23:03 I will try with 20 then 23:05 blackburn: nothing good with the Gaussian kernel :( 23:34 bad 23:35 by the way 23:48 I don't think I got clearly the idea behind the two parameters for the kernel constructors 23:49 I mean the ones that are called 23:49 CDotFeatures * l, CDotFeatures * r 23:49 so far I am using the same for both 23:49 feats_train, feats_train 23:49 but feels weird to do it that way 23:50 blackburn: should they be different things? 23:53 n4nd0: no 23:53 ok when you train classifier 23:53 you need k_ij between train features and train features 23:54 but when you classify 23:54 you need kernel values between train features and test features 23:54 aham 23:54 that thing is going on when you call apply 23:54 it inits kernel with 23:54 feats_train, feats_test 23:55 and when do things according alphas/support vectors 23:55 actually I am not changing anything new when I call apply 23:56 apply does 23:56 I do sth like 23:56 kernel        = GaussianKernel(feats_train, feats_train, width, size_cache) 23:57 svm = LibSVM(C, kernel, labels_train) 23:57 svm.train() 23:57 output  = svm.apply(feats_test) 23:57 svm.apply() changes kernel 23:57 it inits kernel with feats train and feats test 23:57 svm.apply(feats_train) I mean 23:57 ok 23:58 so there is nothing I should change? 23:58 is it done automatically? 
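[Editor's note: the Gaussian-width question above ("is there any heuristic or thumb rule") is left open in the log. One common outside rule of thumb, not suggested by anyone in the conversation, is to start from the median pairwise distance between training vectors and then refine by cross-validation around that value.]

```python
# Median pairwise Euclidean distance between training vectors: a common
# starting point for the Gaussian kernel width. Assumes at least two
# vectors. Requires Python 3.8+ for math.dist.
import math

def median_pairwise_distance(vectors):
    dists = [math.dist(a, b)
             for i, a in enumerate(vectors)
             for b in vectors[i + 1:]]
    dists.sort()
    mid = len(dists) // 2
    if len(dists) % 2:
        return dists[mid]
    return 0.5 * (dists[mid - 1] + dists[mid])
```

This only picks a sensible scale; cross-validation, as mentioned earlier in the log, is still the way to pick the final value.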
23:58 yes 23:58 it is the same if you 23:58 did 23:58 kernel.init(feats_train,feats_test) 23:58 output = svm.apply() 23:58 without anytihng in apply() 23:59 cool I get it 23:59 not very clear design here 23:59 I have to go now for a while 23:59 be back later 23:59 bye 23:59 but no idea how to do it flexible and in better way 23:59 bye 23:59 --- Log closed Sun Feb 19 00:00:19 2012
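[Editor's note: the train/test kernel initialization explained at the end of the log can be illustrated without shogun. Training needs the train-by-train kernel matrix; `svm.apply(feats_test)` re-initializes the kernel internally so that prediction uses train-by-test values. A plain-Python sketch, using one common parameterization of the Gaussian kernel (shogun's exact width convention may differ).]

```python
# Why the kernel has two feature arguments (lhs, rhs): training evaluates
# k over (train, train), prediction evaluates k over (train, test).
import math

def gaussian(x, y, width):
    return math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)) / width)

def kernel_matrix(lhs, rhs, width):
    return [[gaussian(x, y, width) for y in rhs] for x in lhs]

train = [[0.0, 0.0], [1.0, 1.0]]
test = [[0.5, 0.5]]
K_train = kernel_matrix(train, train, width=2.0)  # what training consumes
K_test = kernel_matrix(train, test, width=2.0)    # what prediction consumes
```

`K_train` is square and symmetric with ones on the diagonal, while `K_test` has one row per training vector and one column per test vector, which is the re-initialization `apply()` performs behind the scenes.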