--- Log opened Wed Feb 15 00:00:19 2012
01:13 <n4nd0> blackburn: hi!
01:22 <blackburn> n4nd0: hi
01:25 <n4nd0> blackburn: how is it going?
01:25 <n4nd0> blackburn: I tested the classifier splitting the data into train and test and I am quite surprised with the results
01:25 <blackburn> fine, but got no sleep today preparing for a talk at a student conference :)
01:25 <blackburn> tell me
01:26 <n4nd0> blackburn: they were much better than I expected
01:26 <blackburn> with which classifier?
01:26 <n4nd0> blackburn: SVMLib
01:26 <blackburn> libsvm I guess
01:26 <blackburn> I see
01:27 <n4nd0> blackburn: yeah, libsvm sorry
01:27 <n4nd0> blackburn: so the accuracy is 0.993338360985
01:27 <n4nd0> blackburn: which is much better than I expected using just an SVM and the image pixels as features
01:27 <blackburn> looks pretty unreal for faces hmm
01:28 <n4nd0> blackburn: exactly
01:28 <blackburn> what are the features?
01:28 <n4nd0> blackburn: pixel values
01:28 <blackburn> is it the CBCL face database?
01:29 <n4nd0> blackburn: black and white images, normalized to mean 0 and std 1, but pixel values indeed
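The per-image normalization described here (zero mean, unit standard deviation over the pixel values) can be sketched in NumPy; the array layout (flattened 19x19 grayscale images, one per row) is an assumption:

```python
import numpy as np

def normalize_images(images):
    """Normalize each image to zero mean and unit variance.

    `images` is assumed to be an (n_samples, n_pixels) array of
    flattened grayscale images, e.g. 19x19 faces -> 361 pixels.
    """
    images = images.astype(np.float64)
    mean = images.mean(axis=1, keepdims=True)
    std = images.std(axis=1, keepdims=True)
    std[std == 0] = 1.0  # guard against constant images
    return (images - mean) / std

# toy check on random "images"
X = normalize_images(np.random.rand(5, 361))
```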
01:29 <n4nd0> blackburn: it is the database I used for a course at the university, let me check if it was taken from a known database
01:30 <blackburn> I recall it has 19x19 images too
01:30 <n4nd0> blackburn: yes
01:30 <blackburn> then why did you do splitting?
01:31 <blackburn> I mean there is a test set
01:31 <n4nd0> so I don't use all the data for training but keep part of it to test as well
01:32 <n4nd0> I don't know if I got what you meant
01:32 <blackburn> I mean, if we are talking about the same dataset
01:32 <blackburn> there are train and test sets
01:33 <blackburn> so you don't have to split, I guess
01:34 <n4nd0> aha, I see
01:34 <n4nd0> I will try with that database to see what I get
01:34 <blackburn> btw it is important to use the whole training set
01:34 <blackburn> or even with virtual images
01:34 <blackburn> it is a common practice
01:34 <n4nd0> virtual images?
01:35 <n4nd0> artificially generated, you mean?
01:35 <blackburn> i.e. shifted or noised faces
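The "virtual images" idea mentioned here (augmenting the training set with shifted and noised copies of each face) can be sketched minimally; the shift amount and noise level below are arbitrary choices, not values from the conversation:

```python
import numpy as np

def virtual_images(img, shift=1, noise_std=0.05, rng=None):
    """Generate shifted and noised variants of one grayscale image.

    `img` is a 2-D array (e.g. a 19x19 face); returns a list of
    "virtual" training images: four one-pixel shifts plus one
    noised copy.
    """
    rng = np.random.default_rng() if rng is None else rng
    variants = [
        np.roll(img, shift, axis=0),    # shifted down
        np.roll(img, -shift, axis=0),   # shifted up
        np.roll(img, shift, axis=1),    # shifted right
        np.roll(img, -shift, axis=1),   # shifted left
        img + rng.normal(0.0, noise_std, img.shape),  # noised copy
    ]
    return variants

face = np.zeros((19, 19))
extra = virtual_images(face)
```

In practice these variants would be appended to the training set with the same label as the original image.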
01:35 <blackburn> there is a citation here, they used some crazy method for features
01:36 <n4nd0> I remember that was pretty useful to train the cascade with boosting
01:36 <blackburn> yeah, boosting works well in practice afaik
01:36 <blackburn> I am not a big fan of it though ;)
01:36 <n4nd0> I believe it is called bootstrapping
01:37 <blackburn> or more generally ensembles
01:37 <n4nd0> I guess that trained classifiers can be stored and retrieved from a program later, right?
01:38 <blackburn> hmm yes, using serialization techniques
01:38 <n4nd0> like saving them into a file and later loading them into memory
01:38 <n4nd0> does that work between interfaces?
01:38 <n4nd0> e.g., I store something trained with python and load it from octave
01:39 <blackburn> should work
01:39 <blackburn> if you did not use pickle or so
01:39 <n4nd0> I don't really know what pickle is
01:40 <blackburn> pickle is a serialization package for python
01:40 <blackburn> that enables you to save/load objects
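Plain pickle, as described here, is just save/load of Python objects; a minimal example with a stand-in object (a real trained shogun classifier would rely on shogun's own serialization support underneath, as the conversation goes on to explain):

```python
import pickle

# stand-in for a trained classifier; any picklable object works
model = {"name": "libsvm", "C": 1.0, "support_vectors": [0, 3, 7]}

blob = pickle.dumps(model)       # serialize to bytes (or dump to a file)
restored = pickle.loads(blob)    # deserialize back into memory

assert restored == model
```

Note that pickle's format is Python-specific, which is why a model pickled from python could not be read back from octave; cross-interface loading needs a language-neutral serialization layer.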
01:40 <n4nd0> ah ok
01:41 <blackburn> I am not sure there
01:41 <n4nd0> so then there is not something in shogun that enables the serialization?
01:41 <blackburn> you could try
01:41 <blackburn> hmm, that's a pretty complex thing
01:41 <blackburn> let me try to describe it
01:41 <blackburn> while we have swig interfaces
01:42 <blackburn> we have some internal serialization part
01:42 <blackburn> based on C++
01:42 <blackburn> and an 'external' part related to a concrete interface like python or ruby
01:43 <blackburn> so when pickle in python tries to save or load a shogun object it uses that C++ serialization code
01:44 <n4nd0> then it might be possible, just wondering anyway :)
01:48 <n4nd0> blackburn: about gsoc, I want to start taking a closer look at possible projects I could apply for
01:48 <n4nd0> haha good summary of a day
<blackburn> my current life reminds me of that video:
01:50 <n4nd0> blackburn: I think it is good if I start some coding related to it
01:50 <blackburn> okay, there would definitely be structured output learning
01:50 <blackburn> and gaussian processes
01:50 <blackburn> then one idea would possibly be related to ECOC and some label tree learning
01:51 <n4nd0> are you planning to apply for one of those this year or will you continue with dimensionality reduction?
01:52 <blackburn> ah yes, I am going to apply for multitask learning
01:53 <n4nd0> I read a bit today about gaussian processes and structured output learning, both look really interesting
01:54 <blackburn> yeah, promising edges
01:55 <n4nd0> blackburn: so what do you think is a suitable approach to get into one of those?
01:56 <blackburn> I am not sure what you mean
01:56 <n4nd0> blackburn: study a current implementation of it, like this one for GPs, and later try to port it?
01:56 <blackburn> hmm, I would suggest you take a look at scikits' GPs
01:57 <blackburn> I believe it would be more readable
01:57 <n4nd0> all right, thanks :)
01:58 <n4nd0> blackburn: so I meant, for example, I start a documentation phase and once the coding can be started I have to tell around here that I am working on that, and that's all?
01:58 <blackburn> not really
01:59 <blackburn> are you asking about the gsoc application part?
01:59 <n4nd0> right now I was asking about collaboration before gsoc
02:00 <blackburn> well, it is not necessary to start right now
02:00 <n4nd0> I guess that if I start with this during the next days I will be able to start working on it before gsoc has started
02:00 <blackburn> you don't have to hurry
02:00 <n4nd0> aha ok
02:01 <blackburn> if you finish things in may, what would you do during summer? :)
02:01 <n4nd0> haha I see
02:02 <n4nd0> I assumed that could give better chances for the application to succeed
02:02 <blackburn> sure, but you may do some other things for now ;)
02:02 <blackburn> for example, you'd be very welcome to integrate your face recognition things
02:03 <blackburn> we lack good -real- examples
02:03 <n4nd0> ok :)
02:04 <n4nd0> that's good then, I will continue with that and try to make a nice example :)
02:04 <n4nd0> some doc about GPs
02:04 <blackburn> I find it pretty useful to implement things by myself in some scripting lang
02:04 <blackburn> and then port to C++
02:05 <blackburn> so if you are hungry for GPs it would be useful to implement them in octave/matlab/python/etc
02:05 <blackburn> I believe it would take less than a week to port things then
02:07 <n4nd0> I have to get some sleep now
02:07 <blackburn> and staying in touch increases your chances significantly :)
02:07 <n4nd0> blackburn: :)
02:07 <blackburn> there are some guys promising to implement things
02:07 <blackburn> they appear and disappear for weeks
02:08 <blackburn> not really good practice for me :)
02:08 <n4nd0> well, thank you for the suggestions
02:09 <blackburn> you should talk to Soeren as well
02:09 <blackburn> to get him to know you
02:09 <blackburn> I am not the boss :)
02:09 <blackburn> okay, sleep well then :)
02:09 <n4nd0> haha ok
02:09 <n4nd0> good night
02:09 <blackburn> good night
02:09 -!- n4nd0 [] has quit [Quit: leaving]
03:40 -!- naywhayare [] has joined #shogun
03:47 -!- blackburn [~qdrgsm@] has left #shogun []
04:16 -!- dfrx [] has joined #shogun
05:17 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]
08:10 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
09:49 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]
09:49 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
09:53 -!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 240 seconds]
10:00 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
10:47 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]
10:59 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
11:57 -!- dfrx [] has quit [Quit: Leaving.]
14:04 -!- blackburn [5bdfb203@gateway/web/freenode/ip.] has joined #shogun
14:05 <blackburn> wiking: JS was a little better than HIK ;)
14:05 <wiking> it took this long?
14:06 <blackburn> wiking: not really, just recalled
14:06 <blackburn> but anyway long, yes
14:07 <blackburn> 30K seconds for 2000 vectors
14:07 <blackburn> libsvm OvO
14:07 <blackburn> not a real-time system lol
14:14 -!- n4nd0 [] has joined #shogun
14:42 <blackburn> sonne|work: ???-??-??-???!
14:47 <blackburn> unbelievable %$%$
15:38 <sonne|work> blackburn: ?
15:38 <blackburn> sonne|work: you don't answer mails and it is impossible to catch you!
15:40 <blackburn> looks like I've been pinging you for the last month :D
15:40 <blackburn> wiking: are you willing to integrate this homogeneous kernel map from vlfeat?
15:41 <wiking> blackburn: yep
15:41 <blackburn> so I shouldn't, right?
15:41 <blackburn> just checking
15:41 <wiking> blackburn: i'm just finishing up some other code, but if u guys say that you are willing to do the pull then i'll do it this week
15:41 <wiking> i guess it should go within the preprocessing
15:42 <blackburn> I'm not really in a hurry with it
15:42 <wiking> i mean preprocessor
15:42 <blackburn> yes, looks like it
15:42 <wiking> ok great
15:42 <blackburn> could be a converter as well..
15:42 <blackburn> but I guess preprocessor fits better
15:43 <blackburn> btw we already have a similar thing here
15:43 <blackburn> random gaussian fourier blabla
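The "random gaussian fourier" preprocessor mentioned here belongs to the same family of explicit feature-map tricks as the vlfeat homogeneous kernel map: approximate a kernel with a finite random feature map so a fast linear machine can replace a kernel machine. A rough sketch of random Fourier features for the Gaussian kernel; the feature count and bandwidth are placeholders, and this is an illustration of the general technique rather than shogun's implementation:

```python
import numpy as np

def random_fourier_features(X, n_features=100, gamma=1.0, rng=None):
    """Map X to a random feature space whose inner product approximates
    the Gaussian kernel exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(0) if rng is None else rng
    d = X.shape[1]
    # frequencies sampled from the kernel's spectral density
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = np.random.default_rng(1).normal(size=(20, 5))
Z = random_fourier_features(X, n_features=2000, gamma=0.5)
# Z @ Z.T now approximates the Gaussian kernel matrix of X
```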
16:21 -!- n4nd0 [] has quit [Ping timeout: 260 seconds]
<CIA-18> shogun: Soeren Sonnenburg master * r0338587 / examples/undocumented/python_modular/ : remove unused gaussian kernel from example -
16:38 -!- blackburn [5bdfb203@gateway/web/freenode/ip.] has quit [Quit: Page closed]
17:33 -!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 245 seconds]
18:13 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
19:33 -!- wiking [~wiking@huwico/staff/wiking] has quit [Remote host closed the connection]
19:33 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
22:36 -!- n4nd0 [] has joined #shogun
--- Log closed Thu Feb 16 00:00:19 2012