--- Log opened Sat Feb 18 00:00:19 2012
-!- n4nd0 [] has joined #shogun00:00
n4nd0so is there any way to build a strong classifier with shogun where several "weak" ones are put together?00:07
n4nd0I am looking for it to work on an example application on face detection00:08
n4nd0I have made tests using fancy features but no good results using svm with a polynomial kernel00:09
@sonney2kn4nd0, we don't have boosting if you mean that00:28
@sonney2ktry multiboost for that00:28
n4nd0sonney2k, I was looking for an alternative solution to boosting00:29
n4nd0I want to stick to shogun00:30
n4nd0sonney2k, about boosting, I have seen in the gsoc ideas page that one of the suggested projects last year was to merge shogun and multiboost00:33
@sonney2kno longer there ...00:33
@sonney2ktoo ambitious00:33
n4nd0but would that still be of interest for shogun?00:34
@sonney2kalright ... sleep time!00:36
n4nd0sonney2k, I have worked with adaboost in particular before00:36
n4nd0so I could start taking a look at multiboost and try to see how that could be ported to shogun00:36
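The AdaBoost scheme n4nd0 mentions (many weak classifiers combined into one strong one, as in classic face detection) can be sketched in a few lines of numpy. This is purely an illustration with decision stumps as the weak learners; it is not shogun or MultiBoost code, and all names are made up here.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """AdaBoost with one-feature threshold stumps; labels y in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # sample weights, start uniform
    ensemble = []                     # list of (alpha, feature, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # exhaustively pick the stump with the lowest weighted error
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)          # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)          # weak learner weight
        pred = pol * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)                 # up-weight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict_adaboost(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * pol * np.where(X[:, j] > thr, 1, -1)
    return np.sign(score)
```

The exhaustive stump search is quadratic and only suitable for toy data; a real port to shogun would plug in arbitrary weak learners behind the same reweighting loop.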
n4nd0good night00:36
-!- n4nd0 [] has quit [Quit: Leaving]00:51
-!- CIA-64 [] has joined #shogun03:00
-!- Netsplit *.net <-> *.split quits: CIA-1803:03
-!- blackburn [~qdrgsm@] has joined #shogun09:08
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]10:09
CIA-64shogun: Sergey Lisitsyn master * rdd64345 / (5 files in 4 dirs): Fixed a couple of warnings -
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun10:44
blackburnwiking: have you tried the 1/2-order homogeneous kernel map?11:01
blackburnvedaldi surprisingly reported improvement with this11:01
wikingwhat's 1/2-hkm?11:03
blackburnwiking: order of homogeneity11:05
blackburnI mean they use 1/2 as order11:05
blackburnand it smooths things11:05
blackburnbut I got worse results11:06
wikingbut how can he do 1/2 order11:09
wikingwhen it's an int in their implementation11:09
blackburnnot order11:10
blackburnmy bad11:11
blackburnk(cx,cy) = c^\gamma k(x,y)11:11
blackburnwiking: using 0.5 for JS improved results by 1%11:20
blackburnon my data11:20
wikingcompared to original JS?11:21
wikingor compared to 1.0 gamma ;P11:21
wikingi'm gonna introduce a new kernel today/tomorrow11:21
wikingand let's see if i can make that HKM as well11:21
blackburnwiking: JS with gamma = 1.0 and JS with gamma = 0.511:22
blackburnbtw it is pretty dirty hack11:22
blackburnfor me11:22
blackburnwiking: jensen-shannon-renyi-mozart?11:22
wikingwhich? :)11:22
wikingahahah yepp11:23
blackburnwiking: hmm what is the natural gamma for JS btw?11:23
wikingi mean this is like basically a Taylor series11:24
wikingso it's just approximation11:24
wikingso i guess gamma -> inf would get you to original JS11:24
wikingSmaller value of @f$ \gamma @f$ enhance the kernel non-linearity and11:26
wikingare sometimes beneficial in applications11:26
blackburnI guess your suggestion about gamma -> inf wasn't right11:27
blackburngamma -> inf makes it LINEAR :)11:27
wikingand it says that11:27
blackburnor something like that11:27
blackburnoh I don't like it11:27
blackburnmakes more params :(11:27
wikingif gamma = 1 then u should obtain the standard kernels11:27
wikingthe order ->inf would do that what i'm saying11:28
wikingi mean yeah as it says here11:28
blackburnI see11:28
wikingThe homogeneous kernel map11:28
wikingapproximation is based on periodicizing the kernel11:28
wikingso basically you could see this11:28
wikingor at least i would see this as sampling the signal11:29
wikingand then u have there the Nyquist-Shannon sampling theorem11:29
wikingand basically period is the sampling rate11:29
blackburnpretty complex huh11:29
blackburnbut yeah I catch it11:30
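The gamma scaling blackburn writes above, k(cx, cy) = c^gamma k(x, y), can be checked numerically. The sketch below uses the scalar Jensen-Shannon kernel (which is 1-homogeneous) and a generic rescaling to order gamma; it is an illustration of the identity only, not shogun's or VLFeat's actual homogeneous-map implementation, and `gamma_variant` is a name invented here.

```python
import numpy as np

def js(x, y):
    # Jensen-Shannon kernel on positive scalars; 1-homogeneous:
    # js(c*x, c*y) == c * js(x, y)
    return x / 2.0 * np.log2((x + y) / x) + y / 2.0 * np.log2((x + y) / y)

def gamma_variant(k1, gamma):
    """Turn a 1-homogeneous scalar kernel k1 into a gamma-homogeneous one:
    k_gamma(x, y) = (x*y)**(gamma/2) * k1(x, y) / sqrt(x*y)."""
    def kg(x, y):
        return (x * y) ** (gamma / 2.0) * k1(x, y) / np.sqrt(x * y)
    return kg

k_half = gamma_variant(js, 0.5)
x, y, c = 0.3, 0.7, 5.0
# homogeneity check: k(c*x, c*y) == c**gamma * k(x, y)
print(np.isclose(js(c * x, c * y), c * js(x, y)))                # gamma = 1
print(np.isclose(k_half(c * x, c * y), c ** 0.5 * k_half(x, y)))  # gamma = 1/2
```

With gamma = 1/2 the kernel grows more slowly with the scale of the inputs, which is the "smoothing" effect discussed above.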
blackburnwiking: where are you doing your phd btw?11:42
wikinggent, belgium11:46
wikingyesterday i've found a fucking paper that partially implements that idea i had lately11:47
wikingit'll be published next month11:47
wikingfuckers... from stanford..11:47
blackburnhah that fucked up feeling11:47
blackburnso you are a big traveller :)11:48
blackburnwiking: will you have enough time for doing latent svms this summer?11:49
wikingi'll be in iceland for a month in july11:49
wikingdoing nothing11:49
wikingyeah i was living quite around this world...11:49
blackburnI wish I did the same haha11:50
wikingu think i had ? :P11:50
wikingthere's this thingy11:50
blackburnwill you live in a box?11:50
wikingi think it's a good idea, and shogun could apply for it11:51
wikingthe only problem is that the program is running out and they don't know if there's going to be a last call for participation or not...11:51
wikingthey're going to decide it at the end of march11:51
wikingand then with this you could travel11:52
wikingok i'm off now11:52
blackburnsee you11:52
blackburnnice program11:52
wikingi'll be back sometime at night11:52
blackburnbut no idea how to apply it for shogun11:52
wikingit's easy11:52
blackburnI mean we can hardly find 4-8 guys11:53
wikingyou are one11:53
wikinganyhow i'll let you know if there is going to be a last call for participation11:53
blackburne.g. Soeren can't participate in this program11:53
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]11:54
blackburnhe is pretty busy with his job11:54
blackburnsonne|work: is it at all applicable to shogun?11:59
blackburnhm I forgot it is saturday13:46
blackburnsonney2k: ^13:46
-!- Ram108 [~amma@] has joined #shogun14:33
Ram108blackburn: there?14:34
blackburnRam108: yes14:34
Ram108hi i think i figured out what went wrong.........14:35
Ram108well the weight vectors were initialised to large values and the error equation yielded very small values to change the weight vectors appropriately14:36
Ram108thanks for the help..........14:36
blackburnhah, did I help you anyhow? :)14:39
Ram108erm do u mind i ask u hw long were u in this field of work?14:39
Ram108i mean machine learning14:40
Ram108i am not able to grasp what all topics does this field comprise off14:41
blackburnmore than year14:41
blackburnmay be 1.514:41
Ram108oh hmmm thanks :)14:41
Ram108well NN, SVM, fuzzy logic, Genetic algorithms14:42
Ram108is that about it?14:42
Ram108i mean do all the learning algorithms fall under one of these?14:42
blackburnNN + SVM mainly14:42
blackburngenetic stuff stands apart usually14:43
blackburnmore generally, evolutionary computing14:43
blackburnfuzzy logic as well14:43
Ram108oh hmmm okay.........14:43
blackburnbut there are a lot of intersections everywhere14:43
blackburnshogun contains svms mainly14:44
Ram108i can see that lol14:44
Ram108thats perhaps why am not able to understand all the liblinear lib...... etc etc14:44
Ram108i ll read up on that......14:45
Ram108by the way could u enlighten me more on what Mr sonney is doing?14:45
blackburnI don't know, he does some research at tomtom14:46
blackburnearlier he was related to bioinformatics14:47
Ram108oh hmmm i c....... and he devotes the rest of his spare time to building shogun?14:47
Ram108ok :) last one....... hw old is shogun?14:49
blackburnsince 199914:50
Ram108thanks :)14:50
-!- Ram108 [~amma@] has quit [Ping timeout: 240 seconds]14:55
-!- Ram108 [~amma@] has joined #shogun15:12
-!- n4nd0 [] has joined #shogun15:33
-!- Ram108 [~amma@] has quit [Ping timeout: 255 seconds]16:28
-!- naywhaya1e [] has joined #shogun18:35
-!- naywhayare [] has quit [Ping timeout: 240 seconds]18:37
-!- blackburn [~qdrgsm@] has quit [Ping timeout: 240 seconds]18:37
-!- sonne|work [~sonnenbu@] has quit [Ping timeout: 240 seconds]18:37
-!- sonne|work [~sonnenbu@] has joined #shogun18:37
-!- blackburn [~qdrgsm@] has joined #shogun18:37
-!- Ram108 [~amma@] has joined #shogun18:43
blackburnn4nd0: hey19:19
-!- n4nd0 [] has quit [Ping timeout: 252 seconds]19:19
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun20:03
-!- Ram108 [~amma@] has quit [Quit: Ex-Chat]20:06
-!- n4nd0 [] has joined #shogun21:16
n4nd0sonney2k: hi! so about what we talked a bit yesterday of adding boosting to shogun21:28
n4nd0sonney2k: I would like to first start with adaboost, do you think that is a good place to start?21:29
wikingblackburn: yo21:39
blackburnwiking: hey21:39
wikingare we getting multiclass linear svm soon?21:40
blackburnyes, I'm working on it21:40
blackburnalthough not very fast21:40
wikingok cool21:40
wikingthat'd be great to see the speedup21:40
blackburnyeah liblinear with cramer singer should be fast and accurate21:41
n4nd0blackburn: do you know sth about multiboost and the gsoc project that was suggested last year porting it to shogun?21:42
blackburnn4nd0: no, unfortunately no21:42
n4nd0blackburn: I started thinking of it since I don't see other way of making the face detector work :(21:43
blackburndo you need a face detector?21:43
n4nd0blackburn: I have tried with some fancy haar features for a single svm but no luck21:43
blackburndid you get better results with boosting?21:43
n4nd0blackburn: well, not really ... it was more with the idea of making this example we talked about21:43
n4nd0blackburn: yes, with the implementation from scratch I did in matlab with boosting of weak classifiers it worked21:44
blackburnwhat are the numbers?21:44
blackburnif you really need boosting you may work on it, else it is okay to show bad results21:45
blackburnbad results are results as well right?21:45
n4nd0haha yes they are21:45
n4nd0in any case it could be nice try to get into shogun some of the algorithms in multiboost21:47
n4nd0blackburn: or do you think it might be too optimistic to do, let's say, in a couple of non-intense weeks?21:48
blackburnn4nd0: Soeren said it is pretty ambitious21:49
blackburndid he?21:49
n4nd0he referred to the gsoc project that was proposed last year, but yes he said that21:49
n4nd0I had in mind anyway to limit to adaboost firstly21:50
n4nd0these are the parameters of the classifiers I get using a single svm in the CBCL database21:50
n4nd0>>>> accuracy   = 0.93266932270921:50
n4nd0>>>> precision  = 0.98550724637721:50
n4nd0>>>> error rate = 0.067330677290821:50
n4nd0>>>> recall     = 0.2881355932221:50
n4nd0>>>> ROC area   = 0.90762609379421:50
n4nd0recall is too low21:51
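The numbers above fit a common pattern: high accuracy and precision with low recall means the classifier rarely predicts "face", and when it does it is usually right, but it misses roughly 71% of the true faces. A minimal sketch of how these quantities relate (a hypothetical helper, not shogun's evaluation API):

```python
def binary_metrics(y_true, y_pred):
    """Confusion-matrix metrics for labels in {-1, +1} (positive = face)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == -1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == -1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == -1 and p == -1)
    n = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / n,
        "error rate": (fp + fn) / n,
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        # low recall = many true faces classified as non-faces
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }
```

On an imbalanced set like CBCL (far more non-faces than faces), accuracy alone is misleading, which is why the recall value is the alarming one here.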
blackburnwhat are results with your boosting?21:52
n4nd0I didn't try it with this database21:52
blackburnehmm how then can you compare :)21:53
n4nd0because we could use that classifier "for real", not in real time though21:54
n4nd0but it worked nicely with images21:54
n4nd0and, even if I have not tried the svm with real images, with a recall of about 0.288 I don't think it will work fine21:54
blackburnI just wonder if recall is calculated correctly21:57
blackburnhmm seems so21:59
blackburnn4nd0: did you try kernels?21:59
n4nd0blackburn: I have used a polynomial one22:00
blackburngaussian? sigmoid?22:00
n4nd0blackburn: I have tried it with degree 2 and 322:00
n4nd0no, I have not tried other ones22:00
n4nd0in the page where I got the CBCL face data I read they had used those kernels and not others22:01
n4nd0I assumed those were the best for this application22:01
n4nd0but it might be that they improve22:02
n4nd0let me try22:02
blackburnwell gaussian works well usually22:02
blackburnn4nd0: btw did you change C?22:02
blackburnthese things are considerable steps22:03
n4nd0currently I am using a value of 1.0 for C22:04
n4nd0I tried changing it to bigger ones but it went worse22:04
n4nd0do you mean normalization for the training and test data?22:04
blackburnyes, both should be normalized22:04
n4nd0so the images get zero mean and std 1?22:04
blackburnI mean vectors of your features should have L2 norm = 1 for linear kernel22:05
blackburncould be better22:05
n4nd0the features are not normalized :(22:05
blackburnpreprocessor = NormOne()22:06
blackburnsomething like that22:06
n4nd0ok, I will take it a look22:06
n4nd0even if the features are the pixels of the images and the images are already normalized, do you think it will make a difference?22:07
n4nd0they are normalized as I told you, zero mean and standard deviation equal to one22:07
blackburnI don't know22:08
blackburnbetter try22:08
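The distinction being made above matters: standardizing each image to zero mean and unit standard deviation is a different operation from scaling each feature vector to unit L2 norm, which is roughly what the NormOne preprocessor blackburn mentions does. A plain-numpy sketch of the latter (function name invented here; shogun stores one example per column):

```python
import numpy as np

def l2_normalize_columns(feats):
    """Scale each feature vector (one per column) to unit L2 norm."""
    norms = np.linalg.norm(feats, axis=0)
    norms[norms == 0] = 1.0           # leave all-zero vectors untouched
    return feats / norms

X = np.array([[3.0, 0.0],
              [4.0, 2.0]])           # two example vectors as columns
Xn = l2_normalize_columns(X)
print(np.linalg.norm(Xn, axis=0))    # -> [1. 1.]
```

With unit-norm vectors the linear-kernel values are bounded in [-1, 1], which often makes the choice of C less sensitive.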
n4nd0blackburn: what is a good way to choose the parameters for the kernels, such as the width for the Gaussian?22:09
wikingblackburn: hey man btw do you know viktor pelevin? he's one of my favorite contemporary writers... and i've just seen that they've made a movie out of one of his books
blackburnwiking: yes, I've seen this movie :)22:10
wikingis it good?22:10
blackburnand have read a book as well22:10
blackburnnot bad :)22:10
wikingunfortunately haven't read that book yet from him.. but all the latest ones...22:10
wikingstarting from the yellow arrow22:10
wikingempire v22:10
blackburngeneration P is the only book I've read :)22:11
blackburnfrom pelevin I mean22:11
blackburnnot generally :D22:11
wikingand sacred book of the werewolf... and i don't know what was the english title for one playing in the big russian revolution time22:11
wikinganyhow he's really cool22:11
wikingi'm just getting this movie22:11
wikingi'm just a bit scared of the subtitles22:12
blackburnis it translated?22:12
blackburnhey you should understand russian a little :)22:12
wikingheheh i do a little22:12
blackburnwiking: there are a lot of really good artists22:13
wikingit's funny when i was living in australia there were a lot of russians around me... i could understand 60-70% of what they were saying... but when i've tried serbian with them they couldn't understand a word...22:13
blackburnepifantsev as tatarsky, efremov as azadovsky..22:13
blackburnhave you been living in australia??22:13
wikingfor almost 2 years22:14
blackburndamn is there any island where you weren't?22:14
wikingnever been to latin america nor africa and neither in asia22:14
wikingjust been at bangkok airport 3 times :P22:14
wikingi wanted to do the transiberian22:14
wikingbut that has been postponed... but hoping to do it once soon22:15
wikingwhere are u living in russia atm?22:15
blackburnyou are welcome at samara/togliatti22:15
wikingit's by the volga river right?22:15
wikinghow far is the star city from you? :)22:16
blackburnthe place 'lada' cars are being made22:16
wikingi mean cosmos city22:16
wikingor what22:16
wikingin kazakstan22:16
blackburnah baikonur?22:16
wikingaaah yeah22:16
blackburnpretty far22:16
blackburnlet me check22:16
wikingfucking hell i'm tired not remembering the name22:16
wikingoooh fuck22:17
wikingit's that faaar22:17
wikingi mean baikonur22:17
wikingheheheh so the name for the car lada samara is coming from the name of the city ?:)))22:17
wikinghahah didn't know that one :>22:18
blackburnVAZ is in togliatti22:18
wikingthat's a funny car :>22:18
blackburnbut cars was named after city, right22:18
blackburnsometimes it is called zubilo ('chisel')22:18
blackburnsomething similar right?22:19
wikingyep yep those lada samaras22:19
wikingi remember seeing them a lot22:19
wikingwell all around eastern europe22:19
wikingbut since the communism stopped being...22:20
blackburnbtw now they do cars like that22:20
wikinghigh tech shit22:20
blackburnnot as ugly22:20
wikingloved russian cars22:20
blackburnas it was22:20
wikingthey were reliable22:20
wikingand simple so easy to fix22:20
blackburnnot really but it is easy to fix it22:20
wikingyeah i mean that's why22:20
blackburnwith a kuvalda (sledgehammer)22:21
wikingeven if there was something even somebody with a little knowledge could fix it22:21
wikingor what was that truck22:21
blackburnuaz is being made a little further north22:21
blackburnin ulyanovsk22:22
wikingahhaha they were like survivor machines22:22
wikingthe uaz bus22:22
blackburnbus? I'm not sure22:22
wikingthis oen!!!22:22
wikingdo u know this one? :)22:22
blackburnthis shit is fucking crazy22:22
wikingthat thing was going in any shit :)))22:22
blackburnit is able to climb 75 degree mountain22:23
wikingamazing shit22:23
blackburnI heard one story22:23
blackburna couple of guys went to hunt a little22:23
blackburnthey had land cruiser, etc22:23
blackburnso they got stuck in the snow22:23
blackburnand there came man on this uaz22:24
blackburnpulled their cars easily and told them to use better cars next time :D22:24
wikingnothing can mess with an UAZ!!!22:24
wikingyeah i totally believe22:24
blackburnhah yeah crazy car22:24
wikingthat car is some amazing piece of mechanism22:25
blackburnthere are a lot of them still here22:25
wikingbut yeah i really like russian made stuff... i mean they are just bruteforce22:25
wikingthey were never nice and shinny stuff22:25
wikingbut for the purpose it was great :)22:25
blackburnrather soviet22:25
wikingi just remember the soviet era stuff22:26
wikingdunno how is it now with russia :>22:26
wikinghow's IT doing it there?22:26
wikingi mean it should be great in one way22:26
blackburninf. tech. stuff?22:26
wikingas there's amazing coders coming from russia22:26
wikingyeah inf.tech22:26
blackburnwell it is ok22:26
blackburnfor example we have a lot of outsource in samara22:27
blackburnnetcracker, epam ,mercury22:27
blackburnI work at netcracker btw22:27
wikingheheheh i always wondered why people talk about outsourcing to india, when russian coders are way better imho22:27
blackburnyeah I think so :)22:28
wikingman when i was working with indians in nokia....22:28
wikingi really cannot explain22:28
blackburnI can imagine haha22:28
wikingif i could have i would release all their codes22:28
blackburnin one book?22:28
wikingit's like 2 functions they've used for EVERYTHING22:28
wikingi remember cleaning up their shit22:29
blackburnthis function oracles everything22:29
wikingand the best is22:29
wikingusing 2 files22:29
wiking.h and .cpp22:29
wikingnot more22:29
blackburndamn how old are you?22:29
wikingand that was for a browser interface22:29
blackburnyou were at every country I know22:29
blackburnworked in every company?22:29
wikingahahah too old man too old22:30
blackburnohh it is being clearer now22:30
wikinghehehe yeah makes sense right? :P22:30
blackburnhah yeah22:30
wikingbut yeah it is fucking crazy cleaning up a 10k+ lines file22:30
blackburnI am 2/3 of your age hah22:30
wikinglittle green grasshopper then22:31
n4nd0blackburn: wow the training with the Gaussian kernel is never ending :-O22:31
blackburnn4nd0: should be a little slower22:31
blackburnwiking: artist playing tatarsky in generation P had some crazy roles before22:34
blackburnfor example22:34
blackburnlets test your russian22:35
blackburndid you get title? :)22:35
wikingwhat is ??????22:36
blackburnTIDE or cutting off the head, something like that I guess22:36
wikingi didn't get ????22:36
blackburnjust tide22:36
wikingcutting was ok22:36
wikingi got that one22:37
wikingand of course ili22:37
wikingaaah golovi22:37
wikingit's glava on serbian22:37
wikingi should have got it22:37
blackburn3:10 hah22:37
blackburnhmm I just wonder what do you think about kosovo22:38
blackburn3:46 is ok a well22:38
wikingno opinion22:38
wikingi mean it's a big mess22:38
wikingit's just bad that people cannot agree on it in a normal manner22:38
wikingbut that's quite usual in balkans :P22:38
blackburnas for me it was a great shame for my country to not help serbians there22:38
wikingi mean that they cannot communicate in a normal manner22:39
blackburnbut may be I'm wrong22:39
wikingit's really not nice for the minorities there (kosovoar people)22:39
wikingbecause of the past stories... oppression by the serbs etc22:39
wikingso i completely understand that part22:39
wikingit's just funny when people from around the world who have no idea about anything22:40
wikingtry to fix it22:40
wikingbut when you look at their 'own mess' it's even worse in a way22:40
blackburnso a lot of my friends (me as well) think 'kosovo je serbia' :)22:40
wikinglike the guy who sketched up the 'solution' for kosovo is from finland22:40
blackburnfunny thing I don't think chechnya should be included22:41
blackburninto russia22:41
wikingand for instance the situation with russians in finland (on the border)22:41
wikingit's like wtf22:41
wikingi mean finnish people just amazingly hating russians... :(22:41
wikinghehehe yeah you have some troubles of your own as well22:42
blackburnfor what?22:42
wikingwell i don't know22:42
wikingit's just something from the past22:42
blackburndo you know how they 'solved' chechnya problem?22:42
blackburnnjet molotoff hah22:42
wikingi mean their hate is irrational (finnish)22:43
wikingthey hate russians because they tried to invade finland couple of times22:43
blackburnhah yeah22:43
wikingbut they have almost no real problems with the swedes22:43
wikingwho kind of like ruled them for 100+ years22:43
wikingso it's amazing how unbalanced that shit is22:44
blackburnI know no one hating finnish :)22:44
wikingbut it's all the same with those countries there in the baltics22:44
blackburnestonia/latvia/lithuania same22:44
blackburnthey hate their soviet legacy22:44
wikingbut it's part of their culture22:44
wikingand identity22:44
wikingso funny to hate something that is part of u22:45
blackburnbtw currently we don't like soviet state of mind as well22:45
wikingi don't know which was better22:45
wikingi mean don't get me wrong22:45
wikingi don't know that much of current state of russia22:45
wikingbut the thing with putin and yelcin22:45
blackburnso the situation with putin clearly describes what I mean22:46
blackburnpeople here want a vozhd ('the leader')22:46
blackburnwho will rule them22:46
wikingyeah i kind of like sensed that one... that in russia some people just want a big leader22:46
blackburn30% do22:46
wikingsomething like stalin22:46
blackburn+ some falsification22:46
wikingor breznyev :P22:46
blackburnand here we go, putin again22:47
wikingbut that's amazing22:47
wikingi mean the whole thing around putin22:47
wikingthe oligarch22:47
wikingit's like a big fucking maffia22:47
blackburnexactly it is22:47
wikingespecially with gazprom22:47
blackburnsome day he will be judged22:47
blackburnparticularly yukos as well22:48
wikingi mean on the other hand if u look what was happening with yelcin...22:48
blackburnyeltsin was worse for sure22:48
wikingi mean that was amazing how the things gone really bad with yeltsin22:48
wikingeverything started to get wasted...22:48
blackburnbut there was a big bonus for russia22:49
blackburncost of oil22:49
wikingheheh yeah22:49
blackburnit impacted everything22:49
blackburnif there was 30$ for barrel - there would be no way for such 'great putin'22:49
blackburnwould be only wastelands here :)22:49
blackburnwiking: do you know how they trying to calm down people here?22:50
blackburnthere were protests after elections22:51
blackburna lot of22:51
wikingread about those22:51
blackburnthey just grow hate to US22:51
blackburnthey say US wants to do revolution here22:51
wikingno way!22:51
blackburnyes, they say we don't want another lebanon here22:51
blackburnor siria22:51
blackburnor egypt22:51
blackburnit works for not-too-smart-people22:52
blackburnbut more educated city people usually just laughs at it22:52
wikingsorry but i gotta run again... would love to continue this conversation some time soon22:52
blackburnaha okay :)22:53
wikingbut yeah fuck putin :P22:53
blackburnwas great to talk to you22:53
wikingyeah you too!22:53
blackburnsee you22:53
n4nd0blackburn: would it be better if I increase the cache-size parameter?22:57
n4nd0right now it is set to 40, and it has been training the svm for a long long while22:57
blackburnn4nd0: yes, should be22:58
blackburnthat's crazy :)22:58
n4nd0yeah I know :-P22:58
n4nd0but it is also because the size of the images changed when I tried with some fancy haar features22:59
n4nd0I am going to come back to the 19x19 and increase the cache size22:59
n4nd0which one is a good value for it?22:59
blackburnit is size of cache in mb22:59
blackburnyou may use any that fits into your memory22:59
n4nd0and for the width in the Gaussian?23:02
n4nd0an approximate value that should go good?23:02
n4nd0is there any heuristic or thumb rule to use?23:02
blackburnwell not really23:03
blackburnit shouldn't be very small or very large23:03
blackburnI don't know any good23:03
n4nd0I will try with 20 then23:05
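One rule of thumb not mentioned in the channel, but commonly used, is the median heuristic: set the width from the typical pairwise distance in the training data. If shogun's GaussianKernel width enters as exp(-||x-y||^2 / width), then a width near the squared median distance is a sensible starting point before cross-validating. A sketch (function name and subsampling scheme are assumptions made here):

```python
import numpy as np

def median_width(X, max_pairs=1000, seed=0):
    """Median pairwise distance over a random subsample of pairs.
    X holds one example per row. For a kernel of the form
    exp(-||x-y||**2 / width), try width ~ median_width(X)**2."""
    rng = np.random.default_rng(seed)
    n = len(X)
    i = rng.integers(0, n, size=max_pairs)
    j = rng.integers(0, n, size=max_pairs)
    keep = i != j                                    # drop self-pairs
    d = np.linalg.norm(X[i[keep]] - X[j[keep]], axis=1)
    return float(np.median(d))
```

The subsampling keeps the cost linear in `max_pairs` instead of quadratic in the dataset size, which matters for image data like 19x19 pixel patches.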
n4nd0blackburn: nothing good with the Gaussian kernel :(23:34
n4nd0by the way23:48
n4nd0I don't think I got clearly the idea behind the two parameters for the kernel constructors23:49
n4nd0I mean the ones that are called23:49
n4nd0CDotFeatures * l, CDotFeatures * r23:49
n4nd0so far I am using the same for both23:49
n4nd0feats_train, feats_train23:49
n4nd0but feels weird to do it that way23:50
n4nd0blackburn: should they be different things?23:53
blackburnn4nd0: no23:53
blackburnok when you train classifier23:53
blackburnyou need k_ij between train features and train features23:54
blackburnbut when you classify23:54
blackburnyou need kernel values between train features and test features23:54
blackburnthat thing is going on when you call apply23:54
blackburnit inits kernel with23:54
blackburnfeats_train, feats_test23:55
blackburnand then does things according to the alphas/support vectors23:55
n4nd0actually I am not changing anything new when I call apply23:56
blackburnapply does23:56
n4nd0I do sth like23:56
n4nd0kernel        = GaussianKernel(feats_train, feats_train, width, size_cache)23:57
n4nd0svm = LibSVM(C, kernel, labels_train)23:57
n4nd0output  = svm.apply(feats_test)23:57
blackburnsvm.apply() changes kernel23:57
blackburnit inits kernel with feats train and feats test23:57
blackburnsvm.apply(feats_train) I mean23:57
n4nd0so there is nothing I should change?23:58
n4nd0is it done automatically?23:58
blackburnit is the same if you23:58
blackburnoutput = svm.apply()23:58
blackburnwithout anytihng in apply()23:59
n4nd0cool I get it23:59
blackburnnot very clear design here23:59
n4nd0I have to go now for a while23:59
n4nd0be back later23:59
blackburnbut no idea how to do it flexibly or in a better way23:59
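What blackburn describes boils down to the kernel object being evaluated on two different pairs of feature sets: (train, train) during training and (train, test) inside apply(). A plain-numpy sketch of the prediction side, under the usual SVM decision function f(x) = sum_i alpha_i k(x_i, x) + b (names here are hypothetical, not shogun's internals):

```python
import numpy as np

def gaussian_k(A, B, width):
    """Kernel matrix between rows of A and rows of B,
    k(x, y) = exp(-||x - y||**2 / width)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / width)

def decision_values(alphas, b, X_train, X_test, width):
    """f(x) = sum_i alpha_i * k(x_i, x) + b for each test point x.
    Training needs gaussian_k(X_train, X_train); prediction needs
    gaussian_k(X_train, X_test), which is why apply() re-inits the kernel."""
    return alphas @ gaussian_k(X_train, X_test, width) + b
```

So passing (feats_train, feats_train) to the kernel constructor and later calling svm.apply(feats_test) is consistent: apply() swaps the right-hand feature set for you.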
--- Log closed Sun Feb 19 00:00:19 2012