--- Log opened Wed Jun 06 00:00:41 2012
01:07 -!- heiko [~heiko@host86-179-59-69.range86-179.btcentralplus.com] has quit [Ping timeout: 256 seconds]
02:39 -!- blackburn [d5578aee@gateway/web/freenode/ip.213.87.138.238] has quit [Ping timeout: 245 seconds]
04:36 -!- romi_ [~mizobe@187.66.121.115] has quit [Quit: Leaving]
05:47 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Read error: Operation timed out]
06:07 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]
06:13 -!- wiking [~wiking@78-23-189-112.access.telenet.be] has joined #shogun
06:13 -!- wiking [~wiking@78-23-189-112.access.telenet.be] has quit [Changing host]
06:13 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
09:01 -!- uricamic [~uricamic@2001:718:2:1634:29b5:2f5b:6ebd:d1b0] has joined #shogun
10:01 -!- gsomix [~gsomix@109.169.142.23] has joined #shogun
10:05 -!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]
10:08 <gsomix> hi
10:34 -!- wiking [~wiking@we02c096.ugent.be] has joined #shogun
10:34 -!- wiking [~wiking@we02c096.ugent.be] has quit [Changing host]
10:34 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
11:08 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun
11:18 -!- gsomix [~gsomix@109.169.142.23] has quit [Quit: Ex-Chat]
11:20 -!- gsomix [~gsomix@109.169.142.23] has joined #shogun
11:52 -!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 265 seconds]
12:00 -!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has joined #shogun
12:07 <alexlovesdata> may I ask: who is zxtx, naywhayare, and the CIA agent?
12:08 <alexlovesdata> with respect to the others I have an idea who they are
12:14 <zxtx> fan of the software
12:14 <zxtx> working on a patch to get pegasos into the repo
12:15 <alexlovesdata> ahh thx!
12:15 <gsomix> alexlovesdata, CIA-9 is a github bot.
12:17 <alexlovesdata> thx
12:23 -!- wiking [~wiking@78-23-189-112.access.telenet.be] has joined #shogun
12:23 -!- wiking [~wiking@78-23-189-112.access.telenet.be] has quit [Changing host]
12:23 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
12:58 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Quit: leaving]
13:11 -!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has quit [Ping timeout: 245 seconds]
15:26 -!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has joined #shogun
16:05 -!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has joined #shogun
16:11 <alexlovesdata> wiking?
16:25 <wiking> alexlovesdata: yep, here
16:26 <alexlovesdata> nice
16:26 <alexlovesdata> ok, so what would you like to do next?
16:27 <wiking> alexlovesdata: so yeah, as I was writing to you earlier, the very short plan for the latent extension of shogun would be to get simple -1,+1 labelled data and solve the optimization problem for that. currently I have a dataset with mammals in it, and the 'task' is basically to label the image (i.e. give the right mammal present in the image) and give a bounding box for it
16:27 <alexlovesdata> so joint object detection and classification
16:28 <wiking> so this is a very typical example for object recognition in images, i.e. your h would be something like (x,y) and (w,h)
16:28 <alexlovesdata> caltech256 animals? a dataset which I had used
16:28 <alexlovesdata> then you could cite me :)
16:28 <wiking> hheheheh :)
16:28 <alexlovesdata> unofficial GSoC rule: cite your mentor. joke aside:
16:28 <wiking> imho there are only 6 different mammals in this dataset
16:29 <alexlovesdata> caltech256 animals has 52 or so
16:29 <wiking> I cannot recall now which dataset this is
16:29 <alexlovesdata> except for the fantasy animals like the minotaur
16:29 <wiking> but then again it's small and simple...
16:29 <wiking> and the features are already ready
16:29 <alexlovesdata> that's not important
16:29 <wiking> so yeah, anyhow... I was trying to get that example working now
16:30 <alexlovesdata> so you want to implement the Felzenszwalb-style latent SVM?
16:30 <wiking> yes
16:30 <wiking> so first that one
16:30 <wiking> and after that try to work on an SO latent svm
16:31 <wiking> as soon as I can start using the solver of n4nd0 (fernando)
16:31 <alexlovesdata> and psi(x,h) would be = ? HoG feature over the box?
16:31 <wiking> yep
16:31 <wiking> so now I have like n HoG features for each image
16:31 <alexlovesdata> ok, sounds easy
16:32 <alexlovesdata> my suggestion would be to get a base latentpsi class and then derive your special class from it
16:33 <alexlovesdata> which has its own argmax
16:33 <alexlovesdata> and its own way to get the psi(x,h)
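The base-class idea being suggested here can be sketched as follows. This is a purely illustrative C++ sketch, not Shogun's actual API: `LatentPsiBase`, `ToyDetectorPsi`, and the toy feature are all hypothetical names and assumptions. The base class owns both the joint feature map psi(x,h) and a brute-force argmax over h, which a derived class (e.g. an object detector where h indexes candidate bounding boxes) can override with something structure-aware.

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch of the suggested design: an abstract base class that
// owns both the joint feature map psi(x,h) and the argmax over h. All names
// are illustrative, not Shogun's real API.
struct LatentPsiBase {
    virtual ~LatentPsiBase() {}

    // joint feature vector psi(x,h) for example x_idx and latent value h
    virtual std::vector<double> psi(int x_idx, int h) const = 0;

    // argmax_h w * psi(x,h); brute force by default, derived classes can
    // exploit the structure of h to search faster
    virtual int argmax(const std::vector<double>& w, int x_idx, int num_h) const {
        int best_h = 0;
        double best = -1e300;
        for (int h = 0; h < num_h; ++h) {
            const std::vector<double> p = psi(x_idx, h);
            double score = 0;
            for (std::size_t d = 0; d < p.size() && d < w.size(); ++d)
                score += w[d] * p[d];
            if (score > best) { best = score; best_h = h; }
        }
        return best_h;
    }
};

// Toy derived class: h indexes a candidate bounding box and psi is a fake
// 2-d "HoG" feature whose second component peaks at h == 3.
struct ToyDetectorPsi : LatentPsiBase {
    std::vector<double> psi(int /*x_idx*/, int h) const override {
        return { 1.0, -(h - 3.0) * (h - 3.0) };
    }
};
```

A specialized detector would override `argmax` as well, searching only boxes near the current estimate instead of all `num_h` candidates.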
16:33 <wiking> ah, so that one can already use this 'example' for other object recognition
16:34 <alexlovesdata> what do you mean by your last question?
16:34 <wiking> so I mean that we have this derived class
16:34 <alexlovesdata> yes
16:34 <wiking> which users can use out of the box
16:34 <wiking> if they wanna have an object detector
16:35 <wiking> for example..
16:35 <wiking> ok
16:35 <alexlovesdata> yes, good idea
16:35 <wiking> that should be fine, although I have to discuss this with blackburn (sergey)
16:36 <wiking> since I think it should be part of the library itself and not the example part of the repository
16:36 <wiking> but this is some minor thing
16:36 <alexlovesdata> what is there to discuss?
16:36 <wiking> well, where exactly to store the code for this 'example'
16:36 <wiking> anyhow, I was just wondering if there's such an easy example for latent svm
16:36 <wiking> that would be a usual use case
16:37 <wiking> as it would be good to have 2-3 use cases for latent svm implemented in the library
16:41 <alexlovesdata> in examples/someinterface ?
16:41 <alexlovesdata> examples/undocumented/*
16:41 <alexlovesdata> or do you mean C++ code?
16:42 <alexlovesdata> C++ code you can even have as a test method
16:43 <wiking> alexlovesdata: I mean that I think these basic 'examples' should actually really be part of the library itself, so that one could just include an ObjectDetector.h or something in his own code, which is basically a latent svm based object recognizer...
16:44 <alexlovesdata> yes, this is ok
16:44 <alexlovesdata> you have a base class and a derived example
16:44 <wiking> yep
16:44 <wiking> but I need to talk about this with blackburn... how exactly we should do this
16:44 <wiking> I mean where to put the actual code/header...
16:45 <alexlovesdata> and then some code for interfaces (python, whatever)
16:45 <alexlovesdata> yes, put it into shogun main
16:45 <wiking> yeah, the modular interfaces will be the last step...
16:45 <wiking> so first only C++, and then when it all works fine I'll do the modular interfaces for python etc..
16:45 <alexlovesdata> because it is a usable piece of code
16:45 <alexlovesdata> my question would be
16:47 <alexlovesdata> will you also implement the mining of hard negatives?
16:48 <wiking> aaaah
16:48 <wiking> not this week :D
16:48 <alexlovesdata> no, not this week
16:48 <alexlovesdata> I wanted to say: it is NOT mandatory for latent SVM
16:48 <wiking> but yeah, I was thinking about it
16:49 <alexlovesdata> I think it is not your core duty to implement Felzenszwalb in all details, ok?
16:49 <alexlovesdata> so if you do binary latent SVM, fine
16:50 <alexlovesdata> doing more, like hard negatives mining, is NOT mandatory.
16:50 <alexlovesdata> everything besides mining hard negatives is luxury... for Donald Trump's wife
16:50 <wiking> yeah, but actually it would be great to have imho
16:51 <wiking> and of course when SO is in working shape, it'd be great to have latent structural svm
16:53 <wiking> what I want this week is really the simple solver I mentioned earlier.... based on OCAS
16:53 <alexlovesdata> that's fine!
16:54 <alexlovesdata> and pls do not waste time on more than mining hard negatives
16:55 <wiking> :>
16:56 <wiking> will try :)
16:57 <alexlovesdata> hmm, should we discuss nando's struct while he is not in the chat?
16:58 <alexlovesdata> bash it and talk about our wishes :) ?
17:00 <wiking> :D
17:00 <alexlovesdata> https://github.com/iglesias/shogun/tree/master/src/shogun/so
17:00 <wiking> ahhahaha
17:00 <alexlovesdata> because if we discuss this in three weeks it might be too late
17:00 <alexlovesdata> so now is the time for wishing what we want
17:01 <wiking> yeah, I already told him 2 weeks ago
17:01 <wiking> what I want :D
17:01 <alexlovesdata> any desired changes which you could tell me?
17:01 <alexlovesdata> so that I know what you want, too :D ?
17:01 <wiking> well, I think the problem here will be
17:02 <wiking> that I'll basically have 2 base classes
17:02 <wiking> 1) a base class latent svm solver with -1,+1 labelling
17:02 <wiking> 2) same but with structured labelling
17:02 <wiking> I don't see it being possible to cover that with only 1 base class
17:03 <wiking> so let's say I'll have something like: LatentLinearMachine and LatentStructuredLinearMachine
17:04 <alexlovesdata> does that affect nando's framework? because for -1,+1 labels you could use OCAS and be fine
17:04 <alexlovesdata> am I wrong?
17:05 <alexlovesdata> wait, I'll get myself a coffee for 3 minutes
17:05 <wiking> no you are wrong
17:05 <wiking> ok, no worries
17:05 <wiking> I'll write the rest here in the meanwhile... so afaik LatentStructuredLinearMachine can be derived from CLinearStructuredOutputMachine
17:06 <wiking> and that would be the latent s-svm solver
17:09 -!- romi_ [~mizobe@187.66.121.115] has joined #shogun
17:12 <alexlovesdata> if I am wrong then pls correct me
17:12 <alexlovesdata> back from getting coffee
17:13 <wiking> ok
17:15 -!- uricamic [~uricamic@2001:718:2:1634:29b5:2f5b:6ebd:d1b0] has quit [Quit: Leaving.]
17:20 <alexlovesdata> afaik LatentStructuredLinearMachine can be derived from CLinearStructuredOutputMachine ...
17:20 <alexlovesdata> I would say: use it rather as a solver instead of deriving from it
17:20 <alexlovesdata> so it would be a member
17:20 <alexlovesdata> or called in train() only
17:21 <alexlovesdata> that might be easier than deriving it
17:21 <alexlovesdata> then you can use nando's stuff only as a solver and are free to design your own interfaces as you like them most
17:21 <wiking> mmm
17:22 <wiking> but we'll have to change it
17:22 <wiking> since the PSI is different in the case of
17:22 <wiking> LatentStructuredLinearMachine and CLinearStructuredOutputMachine
17:22 <wiking> PSI(x,y,h) vs PSI(x,y)
17:23 <wiking> so I wouldn't be able to use it directly
17:23 <alexlovesdata> right
17:23 <alexlovesdata> for solver calls you know the h's already
17:24 <alexlovesdata> so you could add to your member a getpsi_knowhiddenlabels
17:24 <alexlovesdata> method
17:24 <alexlovesdata> which inputs the right psi into nando's solver
17:24 <alexlovesdata> wrong?
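The suggestion just made, impute the h's first and then feed plain psi(x,y) into the SO solver, can be sketched like this. Everything here is an illustrative assumption (`psi_xyh`, `FixedHPsi`, the toy feature), not Shogun code; it only shows how fixing h collapses the latent joint feature PSI(x,y,h) into the PSI(x,y) an inner structured solver expects.

```cpp
#include <cassert>
#include <vector>

// Illustrative sketch (not Shogun code): with the latent values h_i fixed by
// the current argmax, the latent joint feature psi(x,y,h) collapses to an
// ordinary SO feature psi(x,y), which is all an inner SO solver needs.
using Vec = std::vector<double>;

// hypothetical latent joint feature map; the actual form is a toy stand-in
Vec psi_xyh(int /*x_idx*/, int y, int h) {
    return { double(y) + 0.5 * h, double(h) };
}

// the "getpsi_knowhiddenlabels" idea: bind the imputed h_i and expose a
// plain psi(x,y) interface to the structured solver
struct FixedHPsi {
    std::vector<int> imputed_h;  // h_i from the latest latent argmax

    Vec psi_xy(int x_idx, int y) const {
        return psi_xyh(x_idx, y, imputed_h[x_idx]);
    }
};
```

The solver then only ever sees `psi_xy`, and re-imputing the h's between outer iterations just replaces `imputed_h`.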
17:25 <wiking> mmm
17:25 <alexlovesdata> if you see a problem with it pls say so... mistakes belong to me like rotten fruit to a market
17:25  * wiking thinking
17:26 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun
17:27 <wiking> so this is the current implementation of an SO solver: https://github.com/iglesias/shogun/blob/master/src/shogun/so/VanillaStructuredOutputMachine.cpp
17:27 <wiking> this, as is, I won't be able to use as a solver directly in CLinearStructuredOutputMachine
17:27 <wiking> I meant: LatentStructuredLinearMachine
17:28 <wiking> or I don't see it yet
17:29 <wiking> mmm
17:29 <wiking> ok, I'll be able to
17:29 -!- blackburn [d5578d64@gateway/web/freenode/ip.213.87.141.100] has joined #shogun
17:29 <wiking> I'll only need to change the CStructuredModel
17:29 <blackburn> hey
17:29 <blackburn> I just read the logs - wiking, what is the code you want to discuss where to put?
17:30 <wiking> because in https://github.com/iglesias/shogun/blob/master/src/shogun/so/VanillaStructuredOutputMachine.cpp#L34 the passed CFeatures* data would actually be the already calculated PSI(x,y,h)
17:30 <wiking> blackburn: ok, so let's say I have a basic latent svm solver, e.g. LatentLinearMachine
17:30 <blackburn> right
17:30 <wiking> that needs a lot of parameters/functions implemented if you actually want to use it for an actual problem
17:31 <wiking> so let's say u want to have an object detector in an image based on latent svm (typical use case)
17:31 <wiking> that would be something like ObjectDetector : public LatentLinearMachine
17:31 <n4nd0> wiking: there have been some changes in Vanilla and other classes
17:31 <n4nd0> wiking: check my branch 'so' to see the latest ones
17:31 <wiking> imho that class is so 'often' used that it would actually be good to have it in the shogun library
17:32 <blackburn> yes
17:32 <wiking> n4nd0: https://github.com/iglesias/shogun/blob/master/src/shogun/so/VanillaStructuredOutputMachine.cpp#L34
17:32 <blackburn> I don't mind putting it into shogun/latent
17:32 <wiking> isn't this the latest?
17:32 <wiking> blackburn: ok
17:32 <n4nd0> wiking: no
17:32 <wiking> n4nd0: ?
17:32 <wiking> :D
17:32 <wiking> where is it then? :D
17:33 <blackburn> however there could be some issues
17:33 <blackburn> i.e. if you want HoG there
17:33 <n4nd0> wiking: in the branch
17:33 <n4nd0> wiking: not in master
17:33 <wiking> n4nd0: oh shit, yeah sorry :DDD
17:33 <n4nd0> no problem :D
17:33 <wiking> blackburn: well, that implementation is not dependent on HoG itself
17:33 <wiking> so you could use other features
17:34 <n4nd0> I have a bunch of new changes too, I will push them to the branch soon
17:34 <alexlovesdata> right, all we need is a class for psi(x,y,h)
17:34 <blackburn> wiking: then it could become pretty big
17:34 <alexlovesdata> can anyone give me the link to the relevant branch?
17:34 <blackburn> I don't mind putting it into applications as well
17:34 <wiking> alexlovesdata: https://github.com/iglesias/shogun/tree/so/src/shogun/so
17:37 <wiking> n4nd0: liked the other api better :)))
17:37 <wiking> n4nd0: it was more flexible :P
17:37 <n4nd0> wiking: because of the function pointers?
17:37 <wiking> not just because of that
17:37 <alexlovesdata> right, this api is a bit more special
17:38 <alexlovesdata> because it separates the structured labels from the features
17:38 <alexlovesdata> in SO this split is artificial
17:38 <alexlovesdata> one works over Psi(x,y)
17:38 <n4nd0> wiking, alexlovesdata: tell me what parts you don't like and we can adapt it
17:38 <alexlovesdata> this can be constructed from phi(x) and y
17:38 <alexlovesdata> but that is not necessary ...
17:39 <wiking> n4nd0: I'm just checking https://github.com/iglesias/shogun/blob/so/src/shogun/so/StructuredModel.h
17:39 <wiking> as basically that's the thing I'll have to modify
17:39 <wiking> or create a derived class
17:39 <alexlovesdata> if I am allowed to say something ...
17:39 <n4nd0> what do you want to modify?
17:39 <n4nd0> alexlovesdata: sure :)
17:39 <wiking> go ahead
17:39 <alexlovesdata> if we had an alternate setter which allows inputting the Psi(x,y) directly
17:40 <alexlovesdata> without constructing them from struct labels and features
17:40 <alexlovesdata> for our stuff we will probably use a psi class to get this abstraction ... I do not require this
17:40 <n4nd0> alexlovesdata: the part of Psi is quite undone so far
17:40 <alexlovesdata> but as food for thought it might be an idea
17:41 <alexlovesdata> the psi class has its own argmax
17:41 <alexlovesdata> depending on the structure
17:41 <alexlovesdata> and can be initialized as one likes
17:41 <alexlovesdata> e.g. explicitly by set structlabels and set features
17:42 <alexlovesdata> the point is: the struct solver needs only psi(x,y), information about which x and which y belongs to each psi, and the set of all possible ys and xs
17:42 <alexlovesdata> everything else is more specialized to some applications
17:42 <alexlovesdata> am I wrong??
17:43 <alexlovesdata> that's why I would like to have a way such that one can input the psis directly together with an argmax
17:44 <alexlovesdata> and that was possible with the old C-style interface
17:44 <alexlovesdata> by overriding the function pointer
17:44 <alexlovesdata> but you can do that with the new interface as well
17:44 <blackburn> please prefer interfaces
17:44 <n4nd0> alexlovesdata: my idea is that Psi would be a class member of the StructuredModel
17:44 <alexlovesdata> ... with a different abstraction however
17:44 <blackburn> pointers are more painful for modular interfaces
17:45 <n4nd0> alexlovesdata: then you could have a set_psi there too
17:45 <alexlovesdata> I agree, blackburn
17:45 <alexlovesdata> and it's uglier
17:45 <blackburn> with brand new directors (TM) we can do some funky shit here
17:45 <alexlovesdata> however even set_psi is bad when the psis are too big for memory
17:46 <alexlovesdata> so it would be better to have a psi class which has its get_a_specific_psi member
17:46 <alexlovesdata> because the solver could just use the getter to get the right psi
17:46 <alexlovesdata> and its associated struct label
17:47 <alexlovesdata> and no need for members like vector<fullpsis>
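The get_a_specific_psi idea can be sketched as follows, with all names (`PsiProvider`, `LazyPsi`, `get_psi`) being illustrative assumptions rather than Shogun API: the solver never holds a `vector<fullpsis>`, it only asks a provider for one psi at a time, so psis can be computed on the fly or streamed from disk.

```cpp
#include <cassert>
#include <vector>

// Sketch of the proposed abstraction (illustrative names): the solver only
// ever requests a single psi(x_i, y_j), so the full set of psis never has
// to fit in memory; a derived provider may compute or load them on demand.
struct PsiProvider {
    virtual ~PsiProvider() {}
    virtual int num_examples() const = 0;
    virtual std::vector<double> get_psi(int feat_idx, int lab_idx) const = 0;
};

// toy provider that "computes" each psi lazily instead of storing them all;
// a real one might read from disk/SSD here
struct LazyPsi : PsiProvider {
    int n;
    explicit LazyPsi(int n_) : n(n_) {}
    int num_examples() const override { return n; }
    std::vector<double> get_psi(int feat_idx, int lab_idx) const override {
        return { double(feat_idx), double(lab_idx) };  // stand-in for load/compute
    }
};

// a solver-style loop touches exactly one psi at a time
double sum_scores(const PsiProvider& p, const std::vector<double>& w, int lab_idx) {
    double s = 0;
    for (int i = 0; i < p.num_examples(); ++i) {
        std::vector<double> psi = p.get_psi(i, lab_idx);
        for (std::size_t d = 0; d < psi.size() && d < w.size(); ++d)
            s += w[d] * psi[d];
    }
    return s;
}
```

Nothing in `sum_scores` depends on whether the provider precomputes, caches, or streams, which is exactly the scalability point being made.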
17:47 <n4nd0> we should also take into account that sonney2k wants to have the joint features or psi with the COFFIN idea
17:47 <n4nd0> so I think that in the end we will use a class similar to CDotFeatures
17:47 <alexlovesdata> well, the psis then need a scalar product (member of the class) and a linadd ...
17:47 <alexlovesdata> yep
17:48 <alexlovesdata> I have not looked into CDotFeatures but my idea behind it is: it needs some getter for the psi(x_i,y_i)
17:48 <alexlovesdata> and for its associated label index and feature index
17:49 <alexlovesdata> that should be enough for the solver
17:49 <alexlovesdata> and a derived class could implement a psi from CFeatures and CStructuredLabels ... as done now in the current code
17:50 <alexlovesdata> so you would retain the current functionality
17:50 <alexlovesdata> just split the solver from getting the psis
17:50 <alexlovesdata> that's my suggestion ... sorry for assholing around
17:51 <alexlovesdata> so the structuredmodel would have a setpsiclass member or so
17:51 -!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has quit [Quit: Page closed]
17:52 <alexlovesdata> is that an idea?
17:52 <wiking> n4nd0: ping :)
17:53 <n4nd0> alexlovesdata: so what you suggest is to have a psi_function that is a member of the model, with a setter
17:53 <n4nd0> or?
17:53 <alexlovesdata> yes
17:53 <alexlovesdata> but psifunction is a class itself
17:53 <n4nd0> wiking: I was answering, it takes some time to read and think :P
17:53 <wiking> n4nd0: :>>> no worries
17:54 <n4nd0> alexlovesdata: I agree with that suggestion, it's the idea I have
17:54 <alexlovesdata> I get older, too ...
17:54 <wiking> ok
17:54 <n4nd0> though it's not so clear to me what functionality we should provide in this base psi_function class
17:54 -!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has joined #shogun
17:55 <alexlovesdata> what the solver needs ...
17:55 <alexlovesdata> 1) accessing the i-th psi
17:55 <alexlovesdata> 2) getting its index into the training data (no x_i actually necessary)
17:55 <alexlovesdata> 3) getting its index into the structlabels
17:56 <alexlovesdata> if we assume that structlabels are discrete
17:56 <n4nd0> where the i-th represents both an index for a feature and another for a label?
17:56 <alexlovesdata> the index could, in a continuous case, also be a vector of real numbers
17:56 <alexlovesdata> but in the simplest discrete case it is a long or a vector of longs
17:57 <alexlovesdata> for prediction it needs the range of struct labels
17:57 <alexlovesdata> not for the solver: ways to construct these psis
17:58 <alexlovesdata> index for training data means: for which training data point x_i
17:59 <alexlovesdata> index for structlabels means: which y was used
17:59 <alexlovesdata> so these two indices are two different things
18:00 <alexlovesdata> I need a black tea for a moment
18:03 <alexlovesdata> back
18:04 <n4nd0> ok, tell me then
18:04 <alexlovesdata> nando: you could check, for a struct formulation, what it needs besides the Psis
18:04 <alexlovesdata> then you know what members you will need for the psi class
18:04 <alexlovesdata> it will also have its own argmax
18:04 <alexlovesdata> because that depends on the structure of Psi
18:04 <n4nd0> why?
18:05 <n4nd0> I think that Psi and ArgMax should be different parts
18:05 <n4nd0> I don't understand why the psi function has its own argmax
18:06 <alexlovesdata> because max_{y \in Y} w*psi(x,y)
18:06 <alexlovesdata> depends on the structure of psi(x,y) and y
18:06 <alexlovesdata> in the most generic case it would be searching all y's brute force
18:07 <alexlovesdata> in more special cases you would search only some y's based on their structure
18:07 <alexlovesdata> e.g. in computer vision only some bounding boxes close to a given one
18:07 <alexlovesdata> y = bounding box params
18:07 <alexlovesdata> that's why argmax would be a member of the psi class
18:07 <alexlovesdata> wrong?
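For reference, the two prediction rules being contrasted throughout this discussion can be written out in standard structured/latent SVM notation (this is textbook notation, not anything Shogun-specific):

```latex
% Structured prediction: argmax over the structured output space
\hat{y}(x) = \operatorname*{argmax}_{y \in \mathcal{Y}} \; w^{\top} \Psi(x, y)

% Latent variant: maximize jointly over the label y and the latent variable h
\bigl(\hat{y}(x), \hat{h}(x)\bigr)
  = \operatorname*{argmax}_{y \in \mathcal{Y},\; h \in \mathcal{H}} \; w^{\top} \Psi(x, y, h)
```

The argument above is that how efficiently this argmax can be evaluated depends entirely on the structure of \(\Psi\) and \(\mathcal{Y}\) (and \(\mathcal{H}\)), e.g. restricting the search to bounding boxes near a given one.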
18:08 <n4nd0> I understand your point
18:08 <n4nd0> but thinking of the code, it looks kind of weird to me that the psi function has its own argmax
18:09 <n4nd0> it is like, the psi function is computed independently of how the argmax is computed
18:09 <n4nd0> then, why should argmax be a member of psi?
18:09 <alexlovesdata> because I would say that computing the argmax depends on the structure of psi and y
18:10 -!- romi_ [~mizobe@187.66.121.115] has quit [Ping timeout: 244 seconds]
18:10 <alexlovesdata> I think a good starting point would be if you look into the SO formulation based on: how would we start if we loaded psi(x,y) from disk
18:11 <alexlovesdata> what members would the psi class need
18:11 <n4nd0> the psi class needs labels and features
18:11 <alexlovesdata> if you work directly with precomputed psis
18:11 <alexlovesdata> only labels, no x's
18:11 <n4nd0> why not?
18:11 <alexlovesdata> because in SO SVM you never use the X directly in the optimization
18:12 <alexlovesdata> only Psi(x,y)
18:12 <alexlovesdata> am I wrong?
18:12 <n4nd0> ok, so you mean like we use a particular example of X (let's say an x_i) but not the whole X?
18:13 <alexlovesdata> what you need is only, for psi(x_i,y_i), to remember y_i and the index i
18:13 <alexlovesdata> no
18:13 <alexlovesdata> psi(x,y)=cos(x)*log(y)
18:14 <alexlovesdata> you can work directly with the psis
18:14 -!- heiko [~heiko@host86-179-192-248.range86-179.btcentralplus.com] has joined #shogun
18:14 <alexlovesdata> you will never need to know the value of x at any point in training or testing
18:14 -!- romi_ [~mizobe@187.66.121.115] has joined #shogun
18:15 <n4nd0> provided that the psis are precomputed, or?
18:15 <alexlovesdata> right!
18:15 <alexlovesdata> and our derived psi class takes care of precomputing them in the style which you like
18:15 <alexlovesdata> e.g. precomputing on the fly from x's and y's like you and nico are used to doing
18:16 <n4nd0> ok, I understand what you mean
18:16 <alexlovesdata> but we can also load psis from disk or an SSD on demand (that's why the getter for a required single psi(x_i,y))
18:17 <n4nd0> your point implies that m_features should not be in CStructuredModel, right?
18:17 <alexlovesdata> right!
18:17 <alexlovesdata> because you can still load them if necessary from SSD via get_your_psi
18:18 <alexlovesdata> even when they do not fit into your mem
18:18 <alexlovesdata> that would be scalable ;)
18:18 -!- heiko1 [~heiko@host86-180-43-237.range86-180.btcentralplus.com] has joined #shogun
18:18 <alexlovesdata> and a derived psi class could take care of that loading on demand or whatever
18:18 <alexlovesdata> stop me if I am talking crap
18:18 -!- heiko [~heiko@host86-179-192-248.range86-179.btcentralplus.com] has quit [Ping timeout: 256 seconds]
18:18 <alexlovesdata> may happen ;)
18:18 <n4nd0> aham, so m_features could even disappear from LinearSOMachine
18:19 <n4nd0> alexlovesdata: haha ook :D
18:19 <alexlovesdata> right, because it asks the getter member to provide the next psi
18:19 <alexlovesdata> yeah!
18:20 <n4nd0> alexlovesdata: ok, so I can understand that, but you still have to convince me about Psi having the Argmax :)
18:20 <alexlovesdata> hehehe
18:21 <alexlovesdata> so the goal is to compute argmax_y w*psi(x_i,y), right?
18:21 <n4nd0> yes
18:22 <alexlovesdata> I'll look up something
18:24 -!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has quit [Quit: Page closed]
18:24 <alexlovesdata> so what happens if you have prior knowledge about how to compute the psis from x and y
18:24 <alexlovesdata> encoded in your derived class
18:24 -!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has joined #shogun
18:24 <alexlovesdata> then you can write a very efficient argmax
18:24 <alexlovesdata> as a member of this class
18:24 <alexlovesdata> which exploits properties of the psi to compute the argmax
18:25 <alexlovesdata> to skip some candidate y's
18:25 <alexlovesdata> example: y \in \RR^d
18:25 <alexlovesdata> x \in \RR^d
18:26 <n4nd0> still, I see the relation the other way around; argmax has psi as a member
18:26 <CIA-9> shogun: Heiko Strathmann master * rdaeea81 / examples/undocumented/libshogun/statistics.cpp : put different values to examples - http://git.io/r9gYDA
18:26 <CIA-9> shogun: Heiko Strathmann master * r724eb3b / examples/undocumented/libshogun/statistics.cpp : Merge pull request #570 from karlnapf/master - http://git.io/CqY94g
18:27 <n4nd0> alexlovesdata: would that fit with what you are saying?
18:27 <n4nd0> I think it would
18:27 <alexlovesdata> psi(x,y)=cos(sum(x))*y
18:27 <alexlovesdata> psi is one-d
18:28 <alexlovesdata> then for w>0 and cos(sum(x))>0 you can skip all negative y's
18:28 <alexlovesdata> the argmax is a function ...
18:28 <alexlovesdata> you want to make it a class?
18:29 <n4nd0> yes
18:29 <n4nd0> it's already a class in the last version of the code
18:29 <n4nd0> we cannot afford to use function pointers
18:29 <alexlovesdata> or better: psi(x,y)=cos(sum(x))*y*difficultcomplexbutpositivefunctionof(x,y)
18:30 <alexlovesdata> then an efficient argmax can just look at the signs of the first two terms
18:30 <alexlovesdata> and skip the difficultcomplexbutpositivefunctionof(x,y)
18:30 <alexlovesdata> if psi is one-dimensional
18:31 <alexlovesdata> does that serve as an example?
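The sign argument above can be made concrete in a runnable toy (all names are illustrative; `expensive_positive_g` stands in for the "difficultcomplexbutpositivefunctionof"): when w > 0 and cos(x) > 0, every negative y yields a negative score, so a structure-aware argmax can skip those candidates and never evaluate the expensive positive factor for them.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Toy version of the 1-d example psi(x,y) = cos(x) * y * g(x,y) with g > 0.
// g_calls counts evaluations of the expensive factor, to show the skipping.
static int g_calls = 0;

double expensive_positive_g(double /*x*/, double y) {
    ++g_calls;
    return 1.0 + y * y;  // always positive, stands in for the hard part
}

double psi(double x, double y) {
    return std::cos(x) * y * expensive_positive_g(x, y);
}

// structure-aware argmax over a finite candidate set of y's
double argmax_y(double w, double x, const std::vector<double>& ys) {
    double best_y = ys.front();
    double best = -1e300;
    // sign argument: if w > 0 and cos(x) > 0, negative y's cannot win
    const bool skip_negative = (w > 0 && std::cos(x) > 0);
    for (double y : ys) {
        if (skip_negative && y < 0) continue;  // skipped without touching g
        const double s = w * psi(x, y);
        if (s > best) { best = s; best_y = y; }
    }
    return best_y;
}
```

With candidates {-2, -1, 1, 2}, w = 1, and x = 0, only the two positive y's ever reach the expensive factor, which is exactly why such an argmax belongs next to the knowledge about psi's structure.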
18:31 <alexlovesdata> my argument for making the argmax a member is that it needs only the w and knowledge about the structure of psi and y
18:32 <alexlovesdata> this is already present in the psi class
18:32 <n4nd0> ok
18:32 <n4nd0> but thinking of argmax as a class too
18:32 <alexlovesdata> which is no contradiction
18:32 <alexlovesdata> because a derived class could call a member argmax, right?
18:32 <n4nd0> your idea should also fit if psi IS the member of argmax
18:33 <n4nd0> alexlovesdata: yes
18:33 <alexlovesdata> your idea should also fit if psi IS the member of argmax: yes
18:33 <alexlovesdata> I agree
18:33 -!- blackburn1 [~blackburn@188.168.3.9] has joined #shogun
18:33 <alexlovesdata> hmm, C++ has no real pope for it, what a pity
18:34 <n4nd0> alexlovesdata: for what?
18:35 <alexlovesdata> for being an instance which tells us what to choose and can never make a mistake :)
18:36 <alexlovesdata> with the psi being a member of the argmax you would then call argmax->psiclass->getnextpsi in the optimization
18:36 <alexlovesdata> to get psi(xi,yi)
18:36 <alexlovesdata> also possible ...
18:37 <n4nd0> it is because IMHO having argmax inside psi is not intuitive
18:41 <alexlovesdata> I would be tempted to ask why not intuitive
18:41 <alexlovesdata> but I do not want to get on your nerves
18:41 <n4nd0> haha
18:41 <n4nd0> no problem
18:41 <alexlovesdata> you can tell nico to berate me for today :)
18:41 <n4nd0> because Psi doesn't need to know anything about the argmax in order to do its task
18:42 <alexlovesdata> right
18:42 <alexlovesdata> but sometimes the argmax can use specialized information about the psi for the computation
18:43 <alexlovesdata> and I think technically it would not make the argmax more special by putting it into the psi, right?
18:43 <n4nd0> that is still in the direction of argmax needs psi
18:43 <n4nd0> but not psi needs argmax :D
18:43 <alexlovesdata> right
18:43 <alexlovesdata> but psi also needs no getter
18:44 <alexlovesdata> but the getter needs psi
18:44 <n4nd0> ?
18:44 <alexlovesdata> that's why the getter which delivers the i-th psi is a member
18:44 <alexlovesdata> could be that we are stuck in a question which needs a pope ... :)
18:45 <n4nd0> yeah
18:45 <alexlovesdata> or you do as you prefer, as long as we can implement a specialized argmax for a special psi
18:46 <n4nd0> it feels better if we all agree
18:46 <alexlovesdata> I would attach functions to the data classes ... but I will not require that from another ... because that is a style matter
18:46 <alexlovesdata> (a papal matter)
18:47 <blackburn1> alexlovesdata: I am curious - can one use latent svm with not only a bounding box but some transformations like rotation or perspective?
18:47 <alexlovesdata> I can agree on anything which allows users to implement specialized argmaxes and psi-getters
18:47 <alexlovesdata> I think for general users yes
18:47 <alexlovesdata> that's why I want an abstract psi
18:48 <alexlovesdata> so that any guy who needs something more weird can program it
18:48 <n4nd0> yes, that's important too
18:50 <alexlovesdata> so that construction of the actual psi can be done outside the solver code
18:51 <alexlovesdata> except for where it is unavoidable
18:52 <alexlovesdata> maybe I forgot the important point:
18:52 <alexlovesdata> which I had mentioned just now ... splitting solving the problem from constructing the psi
18:53 <alexlovesdata> because with the current interface that is inside the solver
18:54 <n4nd0> I am sorry but I don't understand
18:55 <alexlovesdata> void set_labels(CStructuredLabels* labs); /** set features * @param feats features */ void set_features(CFeatures* feats); /** computes \f$ \Psi(\bf{x}, \bf{y}) \f$ */ SGVector< float64_t > compute_joint_feature(int32_t feat_idx, int32_t lab_idx)
18:56 <alexlovesdata> 1. now we store the features and labels in memory (can be loaded on the fly ... your virtual memory will like that :D )
18:56 <alexlovesdata> 2. compute_joint_feature is now inside the solver class,
18:57 <alexlovesdata> with a psi class and a getter it would be outside the solver algorithm
18:57 <alexlovesdata> no one understands me :'-((
18:57 <alexlovesdata> :)
18:58 <alexlovesdata> I need to check MKL regression .. was broken in 0.10.0 and 1.1.0 for matlab with custom kernels
18:59 <blackburn1> alexlovesdata: are you the author of MKL in svmlight?
18:59 <alexlovesdata> no
18:59 <alexlovesdata> that was marius kloft
18:59 <n4nd0> alexlovesdata: it is not like compute_joint_feature is inside the solver
18:59 <blackburn1> aham
18:59 <alexlovesdata> but I have a little bit of insight into it
18:59 <blackburn1> we had some issue there
18:59 <blackburn1> with LINADD optimizations
19:00 <blackburn1> basically it is broken
19:00 <alexlovesdata> ok, that part I never looked into
19:00 <n4nd0> alexlovesdata: the idea at that moment was that the model has the joint feature function in a member and provides this compute_joint_feature for the solver
19:00 <blackburn1> ok
19:01 <n4nd0> alexlovesdata: since the solver does not have a reference to the psi function directly but a reference to the model
19:01 <alexlovesdata> I understand ... but that requires storing CStructuredLabels* m_labels; /** feature vectors */ CFeatures* m_features;
19:01 <alexlovesdata> with an external psi class this would be transparent
19:02 <alexlovesdata> or doing some complex hacks which artificially insert a psi
19:02 <alexlovesdata> I am strongly for keeping that computation out of the solver and letting the psi getter do that job
19:03 <alexlovesdata> because then you can init the solver with a nando-style psi
19:03 <n4nd0> alexlovesdata: would you mind sketching in a class diagram or using a gist how you would like it to be then?
19:03 <alexlovesdata> or an object detection psi
19:04 <alexlovesdata> or a custom psi
19:04 <alexlovesdata> I am a mathematician
19:04 <wiking> n4nd0: I can do that for ya
19:04 <n4nd0> wiking: ok
19:04 <alexlovesdata> I could write an example header, ok? but I am not familiar with diagrams
19:05 <n4nd0> alexlovesdata: ok
19:05 <wiking> alexlovesdata: afaik I know what you'd like to do here so I'll try to sketch it up in a gist
19:05 <wiking> and let you and n4nd0 check it out
19:05 <alexlovesdata> great! thank you!
19:05 <wiking> nw
19:06 <alexlovesdata> nw = ??
19:06 <wiking> I'll post it on the mailing list
19:06 <wiking> nw = no worries
19:06 <n4nd0> nice
19:08 <alexlovesdata> hi nando, I hope you can live with my blabla ;)
19:09 <wiking> :D
19:10 <blackburn1> n4nd0: I failed with the 'curl' word :)
19:10 <n4nd0> alexlovesdata: sure, no problem! it's good to talk and discuss
19:10 <n4nd0> blackburn1: noooooo
19:11 <blackburn1> I have absolutely no idea where to put any curl here :D
19:21 <n4nd0> alexlovesdata: I did some modifications to a diagram I was using, introducing today's conversation
19:21 <n4nd0> alexlovesdata: can you take a look at it and tell me if it represents what you said?
19:21 <n4nd0> http://dl.dropbox.com/u/11020840/shogun/diagram.pdf
alexlovesdatathank you!19:21
n4nd0it is the left-most part19:21
alexlovesdataI take alook19:22
alexlovesdatathe arrow means derived class, right?19:24
alexlovesdatathe 45 degree rotated cube means class is member of another?19:25
alexlovesdataat the first sight looks nice19:25
alexlovesdatashould give us the possibility to do what we need ...19:25
alexlovesdatawiking: what do you think?19:25
wikingalexlovesdata: just checking19:26
n4nd0alexlovesdata: arrow derived class and the cube member yes19:26
wikingah yeah19:26
wikingone thing19:26
n4nd0I think it strictly means weak aggregation or something like that, but member is fine :D19:26
wikingget_psi (lab_idx, feat_idx)19:27
wikingok never mind19:27
wikingit's ok19:27
wikingi mean essentially it'd be the same idx or?19:27
alexlovesdataprobably compute_joint_feature would use get_psi again?19:27
wikingor can it be that you want get_psi(0,1) ?19:27
wikingok yeah you may actually want to do that... so ok19:28
alexlovesdataI think feat_idx refers to the index in x_i19:28
n4nd0alexlovesdata: yes, compute_joint_feature is get_psi19:28
alexlovesdatalab_idx refers to the index into all possible Ys19:28
n4nd0not into all possible Ys19:28
alexlovesdataright? wrong?19:28
wikingalexlovesdata: well afaik you cannot have all the possible Ys19:28
wikingalexlovesdata: only the ones that are present ;)19:29
n4nd0but the index into the Ys we got in training data19:29
n4nd0like the true Ys19:29
wiking^ what n4nd0 means here ;)19:29
alexlovesdatayou are right, I agree19:29
alexlovesdatabut I think lab_idx would not be the index into y_i, right?19:29
alexlovesdataotherwise I should go to bed soon19:29
n4nd0yes, it is19:29
n4nd0but don't go to bed :P19:30
alexlovesdataI need to be awake until 1:30 am :(19:31
n4nd0why did you say that lab_idx is not the i in y_i?19:31
alexlovesdataso it is the index in y_i  ?19:31
wikingalexlovesdata: not in but of19:32
wikingso i guess y_{lab_idx}19:32
alexlovesdataok19:32
wikingit actually refers to a given y in the set19:32
alexlovesdataso lab_idx is NOT the training data index, but the index into all available values for Y ?19:33
alexlovesdatathat I would think19:33
alexlovesdataI get myself a black tea again19:33
wikingalexlovesdata: actually it is19:33
wikingindex in the training data index19:33
n4nd0yes, in the training data index19:34
wikingor i understand it as such19:34
n4nd0me too19:34
wikingok if n4nd0 as well then we are good ;)19:34
n4nd0I mean, for me the index into all available values for Y makes no sense19:34
wikingas it can be an infinite set19:34
-!- romi_ [~mizobe@187.66.121.115] has quit [Ping timeout: 260 seconds]19:34
n4nd0and there's no order defined there19:35
wikingcountable but not finite ;)19:35
alexlovesdataok then I do not get it currently19:38
alexlovesdatanever mind ... temporary confusion19:39
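[Editor's sketch of the design being discussed: psi computation is kept out of the solver and routed through an external psi getter, with get_psi(lab_idx, feat_idx) indexing the labels and features present in the training data. All class and method names below are hypothetical stand-ins, not the actual shogun API.]

```python
# Rough sketch of the external-psi design discussed above.
# JointFeatureMap, StructuredSolver, get_psi, lab_idx and feat_idx are
# illustrative names only, not shogun's real interface.

class JointFeatureMap:
    """External psi getter: the solver only sees psi through this object."""

    def __init__(self, features, labels):
        self.features = features  # training inputs x_i
        self.labels = labels      # training labels y_j (not all possible Ys)

    def get_psi(self, lab_idx, feat_idx):
        """Joint feature vector psi(x_{feat_idx}, y_{lab_idx})."""
        raise NotImplementedError


class MulticlassPsi(JointFeatureMap):
    """Toy psi: copy x into the block that corresponds to its label."""

    def __init__(self, features, labels, num_classes):
        super().__init__(features, labels)
        self.num_classes = num_classes

    def get_psi(self, lab_idx, feat_idx):
        x = self.features[feat_idx]
        y = self.labels[lab_idx]
        psi = [0.0] * (self.num_classes * len(x))
        psi[y * len(x):(y + 1) * len(x)] = x
        return psi


class StructuredSolver:
    """Initialized with any psi object -- a 'nando-style' psi, an object
    detection psi, or a custom psi -- and never computes joint features
    itself."""

    def __init__(self, psi_map):
        self.psi_map = psi_map
```

[With this layout, `StructuredSolver(MulticlassPsi(X, Y, 3))` behaves the same whichever psi implementation is plugged in, which is the point of keeping the computation out of the solver.]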
-!- romi_ [~mizobe@187.66.121.115] has joined #shogun19:42
alexlovesdatanice that we could agree today ...19:47
blackburn1hard discussion tonight19:47
wiking\o/19:49
n4nd0:)19:49
gsomixn4nd0, hey19:49
n4nd0gsomix: hi19:50
gsomixcan you concretize about director classes that you wish?19:50
n4nd0yeah sure, I don't know if you read the conversation about it with sonney2k19:51
n4nd0basically I think it would be a good idea to have the argmax and the psi function of SO as director classes19:51
n4nd0so we can prototype in python and so on19:51
n4nd0wiking, alexlovesdata: do you think it would be good to have that?19:52
alexlovesdatayes, sounds practical :)19:52
wikingsecond that19:53
alexlovesdatamaybe we should prioritize the director wishes among all objects ... what do people use most?19:53
n4nd0gsomix: what do you think?19:53
n4nd0I have no idea how hard it is or how much time it takes to do these director classes19:54
n4nd0gsomix: you have done some already, right?19:54
gsomixn4nd0, I just can tell you I'll do whatever you want :)19:54
gsomixn4nd0, e.g. DirectorDistance, the other day19:55
n4nd0ok19:57
n4nd0so let's wait a few days until the design of these classes is better established and then I will tell you about it19:58
gsomixn4nd0, ok19:58
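[Editor's sketch of the director idea above: expose argmax and the psi function as overridable hooks so they can be prototyped in Python. This pure-Python mock-up only mimics the callback pattern SWIG directors provide for C++ virtuals; the class and method names are made up, not shogun's.]

```python
# Pure-Python mock-up of the director pattern being discussed. The base
# class stands in for a C++ class wrapped by a SWIG director; the subclass
# is the Python prototype a solver would call back into. Names are
# illustrative only.

class StructuredModelBase:
    """Stands in for the C++ base class a SWIG director would wrap."""

    def argmax(self, w, feat_idx):
        raise NotImplementedError  # meant to be overridden from Python

    def get_psi(self, lab_idx, feat_idx):
        raise NotImplementedError


class PythonPrototypeModel(StructuredModelBase):
    """Python-side prototype; the solver calls back into these hooks."""

    def __init__(self, features, labels):
        self.features = features
        self.labels = labels

    def get_psi(self, lab_idx, feat_idx):
        # toy binary psi: x for the positive label, -x for the negative
        x = self.features[feat_idx]
        sign = 1.0 if self.labels[lab_idx] else -1.0
        return [sign * xi for xi in x]

    def argmax(self, w, feat_idx):
        # label index maximizing <w, psi(x, y)> over the training labels
        scores = [sum(wi * pi for wi, pi in zip(w, self.get_psi(j, feat_idx)))
                  for j in range(len(self.labels))]
        return max(range(len(scores)), key=scores.__getitem__)
```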
n4nd0I am going now, talk to you later guys!19:59
-!- romi_ [~mizobe@187.66.121.115] has quit [Ping timeout: 252 seconds]19:59
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Read error: Operation timed out]20:02
alexlovesdataanybody have an idea how the regression label class is initialized in matlab?20:05
alexlovesdatablackburn: what are the scores for the German sign recognition data?20:08
blackburn1alexlovesdata: ah I stopped at 97.84%20:08
blackburn1not enough time to try different hogs, etc :(20:09
alexlovesdataseems to be +0.4 or so20:09
alexlovesdataok20:09
blackburn1yes colors are great20:09
blackburn1out of curiosity I tried adding the test images to the training ones20:09
blackburn199.84% :D20:09
alexlovesdatamaybe they did???20:10
alexlovesdatahmm, the winner got pretty much like 99.8420:10
blackburn1winner got 99.46%20:10
blackburn1well they use LeNet20:10
alexlovesdataI mean there is this transductive stuff20:10
blackburn1yes I just wanted to say I could try transductive20:11
alexlovesdata:D20:11
blackburn1but unfortunately I have to get all things done in 4 hours20:11
alexlovesdatawith a neural net it could be implicitly transductive20:11
blackburn1so I just claim my 97.82 w/o any cheating20:11
alexlovesdatawell but for a thesis you can point out that there are some hard cases missing in training and adding them ... blablabla20:12
blackburn1haha20:12
-!- romi_ [~mizobe@187.66.121.115] has joined #shogun20:13
blackburn1alexlovesdata: I have another thing that lets me claim I've got 100% accuracy :D20:14
blackburn1rejects20:14
alexlovesdataalso nice ... in particular if you can reject automatically20:14
blackburn1alexlovesdata: I just threshold outputs20:14
blackburn1with high threshold I get no errors *at all*20:15
alexlovesdatathen you could grab similar data from the web, add it to training and get 99.99%20:15
blackburn1however 50% are rejected then20:15
alexlovesdatahaha20:15
alexlovesdatathats the ML trick of the day20:15
blackburn1:D20:15
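[Editor's sketch of the reject trick described above: threshold the classifier's confidence and count accuracy only on the accepted examples. The scores and labels below are synthetic, not the actual sign-recognition outputs.]

```python
# Reject-option accuracy: with a high enough confidence threshold there
# are no errors at all on the accepted examples, but many are rejected.
# All numbers are made up for illustration.

def accuracy_with_reject(scores, predictions, truth, threshold):
    """Accuracy on examples with |score| >= threshold, plus reject rate."""
    accepted = [(p, t) for s, p, t in zip(scores, predictions, truth)
                if abs(s) >= threshold]
    reject_rate = 1.0 - len(accepted) / len(scores)
    if not accepted:
        return None, reject_rate
    accuracy = sum(p == t for p, t in accepted) / len(accepted)
    return accuracy, reject_rate

scores      = [2.5, -1.9, 0.3, -0.1, 1.7, 0.2]
predictions = [1,   0,    1,   0,    1,   1]
truth       = [1,   0,    0,   1,    1,   0]

# high threshold: the three confident predictions are all correct,
# but half of the examples are rejected
acc, rej = accuracy_with_reject(scores, predictions, truth, 1.0)
```

[Here `acc` comes out as 1.0 with `rej` at 0.5, mirroring the "no errors at all, but 50% rejected" trade-off above.]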
blackburn1alexlovesdata: too bad I spent too much time on fun instead of training an efficient classifier :D20:19
alexlovesdataaren't we all doing it similar? :D20:20
blackburn1i.e. I managed to cite Karl Popper but did not do overlapping HoG20:20
blackburn1because it was funnier20:20
alexlovesdataand Noetherian rings, too?20:21
blackburn1no, it is impossible I think20:21
blackburn1alexlovesdata: okay lesser cheating - added to trainset images from testset with errors :D20:26
blackburn1only 183 images actually20:26
alexlovesdatabut then did images from the test set which had not been added improve as well?20:26
blackburn1I am not sure I understand that20:27
blackburn1what do you mean? :)20:28
alexlovesdataso you added images from testset to trainset, this implies that all added images will be classified well (because SVMs overfit terribly)20:31
blackburn1hmm yes probably20:31
alexlovesdatabut were there images which you did not add to the trainset, which were classified wrongly before and then correctly after?20:31
alexlovesdatai.e. these images would have profited from improved generalization by adding the 183 others20:32
alexlovesdataand not from mere overfitting20:32
blackburn1no, I added all images where I had errors20:33
blackburn1results will be in a min I think20:33
blackburn1alexlovesdata: does SVM really overfit terribly?20:34
alexlovesdatayea, usually AUC=100 on training data20:35
blackburn1alexlovesdata: what about NNs then?20:35
alexlovesdataI have no idea for neural nets ...20:37
blackburn1alexlovesdata: in my world it was thought that NNs overfit and SVMs are better because they do not overfit so much20:38
blackburn1alexlovesdata: wow, adding 183 images led to 99.69%20:38
blackburn1these 183 images would make me a winner of a contest20:39
blackburn1huuh20:39
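[Editor's synthetic illustration of the experiment above: adding the misclassified test images to the training set guarantees a memorizing classifier gets exactly those right afterwards, so the accuracy jump comes from leakage rather than better generalization. A 1-nearest-neighbour classifier stands in for the SVM here; the data is made up.]

```python
# Test-set leakage with a memorizing classifier (1-NN): once the errors
# are copied into the training set, they are at distance zero from their
# own copies and cannot be misclassified again. Synthetic data.

def nn_predict(train, x):
    """1-NN: label of the closest training point."""
    return min(train, key=lambda point: abs(point[0] - x))[1]

train = [(0.0, 0), (1.0, 0), (3.0, 1), (4.0, 1)]
test  = [(0.4, 0), (2.1, 0), (3.4, 1)]   # (x, true label)

errors = [(x, y) for x, y in test if nn_predict(train, x) != y]
# only the point at 2.1 is misclassified (it sits nearer the 1-cluster)

train += errors                          # add testset images with errors
acc_after = sum(nn_predict(train, x) == y for x, y in test) / len(test)
```

[After the leak, `acc_after` is 1.0 on this toy set even though nothing was actually learned about the genuinely hard region around x = 2.]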
alexlovesdataon the training data I work with I get AUC=10020:39
alexlovesdataerror=020:39
blackburn1alexlovesdata: I feel confused because you make me feel like all the capacity control blabla is useless stuff :)20:40
blackburn1and probably all that stuff is useless for real20:40
alexlovesdatawell capacity control is made for getting a reasonable error on testing data (cross-validation)20:40
alexlovesdatait is not made to make the error rate on the train set equal the error rate on the test set20:41
blackburn1isn't that for some generalization ability?20:42
blackburn1I mean I thought max margin is the point of good generalization20:42
blackburn1am I wrong?20:42
blackburn1alexlovesdata: I remember you are a big fan of our 'president' - let you become a fan of our parliament http://cs304702.userapi.com/v304702563/1de3/Spom2VHmg4o.jpg20:47
blackburn1oops not 183 but 27620:49
alexlovesdatayes, it is for generalization but still you get zero error on training data21:04
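[Editor's toy 1-D illustration of this exchange: two linear separators can both reach zero training error, yet only the max-margin one survives a held-out point near the boundary, which is why zero training error says nothing by itself. All numbers are made up.]

```python
# Two linear separators with zero training error but different margins.
# The margin, not the training error, is what distinguishes them.

train = [(-2.0, -1), (-1.0, -1), (1.0, +1), (2.0, +1)]

def margin(w, b, data):
    """Smallest signed distance y * (w*x + b) / |w| over the data."""
    return min(y * (w * x + b) / abs(w) for x, y in data)

narrow = (1.0, 0.9)   # boundary at x = -0.9, squeezed against one class
wide   = (1.0, 0.0)   # boundary at x = 0, the max-margin choice

# both separate the training set perfectly: zero error on training data
assert all(y * (w * x + b) > 0 for x, y in train for w, b in (narrow, wide))

# a held-out negative point near the boundary: the narrow separator
# misclassifies it, the max-margin one does not
x_test, y_test = -0.5, -1
```

[On this data the wide separator's margin (1.0) is about ten times the narrow one's (~0.1), so it is the one capacity control would pick, even though both have identical training error.]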
alexlovesdatawhat does the image mean?? - you can explain it to me tomorrow, I'm going home, 12 hours is enough!21:05
blackburn1alexlovesdata: the image? the one I sent?21:05
blackburn1well, a deputy playing with a teddy bear21:06
blackburn1:D21:06
-!- alexlovesdata [82955843@gateway/web/freenode/ip.130.149.88.67] has quit [Ping timeout: 245 seconds]21:09
-!- blackburn [d5578d64@gateway/web/freenode/ip.213.87.141.100] has quit [Ping timeout: 245 seconds]21:47
-!- puffin444 [472e31fb@gateway/web/freenode/ip.71.46.49.251] has quit [Ping timeout: 245 seconds]21:49
@sonney2kblackburn1, SVMs don't overfit and your SVM giving that high accuracy on test data certainly does not22:00
-!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun22:04
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]22:19
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun22:23
gsomixsonney2k, hey22:24
@sonney2kgsomix, ho :)22:24
gsomixEuclidianDistance vs DirectorDistance 10:37 (in seconds)22:25
@sonney2kgsomix, how is it going?22:25
gsomixsonney2k, working, programming, building.22:25
@sonney2kgsomix, you mean again 10 times slower right?22:25
@sonney2ksounds good22:25
gsomixsonney2k, nope. 3.7 times22:26
@sonney2kahh ok - I guess you have the non-optimized atlas version like blackburn122:27
@sonney2kgsomix, you could do a director for a general kernel machine22:28
@sonney2knext I mean22:28
gsomixsonney2k, ok22:28
@sonney2ksame with general linearmachine22:28
@sonney2kmaybe these two should be next22:29
@sonney2kI think the only important thing to overload here is the train method22:29
@sonney2kthat's about it22:29
n4nd0sonney2k: by the way, did you read part of conversation wiking, alexander and I had before?23:07
-!- wiking_ [~wiking@huwico/staff/wiking] has joined #shogun23:20
-!- wiking [~wiking@huwico/staff/wiking] has quit [Ping timeout: 240 seconds]23:21
-!- wiking_ is now known as wiking23:22
@sonney2kn4nd0, superficially yes23:38
n4nd0sonney2k: ok, just in case you had a comment, e.g. on whether psi should be a member of argmax or vice versa :D23:48
blackburn1sonney2k: hmm do not overfit at all?23:48
n4nd0that was the hot topic23:48
blackburn1you are all confusing me with contradictory claims :D23:48
gsomixgood night guys23:54
gsomixah, btw, google summer of building report http://instagr.am/p/Lf3WXUMs4H/23:55
gsomixfirst wall23:55
gsomixhehe23:55
gsomix.___.23:55
--- Log closed Thu Jun 07 00:00:41 2012