--- Log opened Tue May 08 00:00:37 2012
wikingnow it's good... no leak with xval00:02
CIA-113shogun: Viktor Gal master * rc132e97 / src/shogun/machine/MulticlassMachine.cpp :00:02
CIA-113shogun: Fix mem-leak in MulticlassMachine00:02
CIA-113shogun: apply() had an extra SG_REF on the returned CLabels -
CIA-113shogun: Soeren Sonnenburg master * r9d6bd98 / (35 files in 15 dirs): introduce ref counted SGMatrix -
wikingmmm classifier_multiclasslinearmachine.cpp is not leaking anymore00:03
@sonney2kthe madness continues00:08
PhilTilletmadness ? This is spartaaaaaa00:11
PhilTilletOh hi everybody :P00:11
PhilTilletI have found a magic paper for SVM on CUDA, which provides also GPL code !00:13
@sonney2kPhilTillet, hehe00:16
@sonney2kdid you try it yet?00:17
PhilTilletas soon as my robot competition is over i'll probably start from that, port that code to OpenCL and integrate it into Shogun (and also integrate the CUDA version)00:17
PhilTilletsonney2k, not yet, my laptop has some problems :p00:17
PhilTilletbut the benchmarks are encouraging :p00:17
PhilTilletand it's only one year old, so very recent00:18
@sonney2khmmh tiny data sets though00:18
@sonney2kwiking, yeah fixes welcome ... it will take another few days to fix them I guess00:19
PhilTilletwell, true, but bigger size should not cause any issue00:19
@sonney2kbut probably not as long as SGVector00:19
wiking"Google breached Oracle's Java copyright, US jury finds"00:19
wikingAPI is copyrightable.:DDDDD00:19
@sonney2ksparse* / stringlist should be no problem00:19
wikingwelcome to hell :>00:20
@sonney2kok bed time00:25
wiking"Oracle's lawyers compared the creation of APIs to writing a piece of music, going further to say that API's are not just "ideas," but creative, copyrightable works that require significant expertise and time to develop"00:25
wikingwe are writing here music apparently00:25
PhilTilletgn sonney2k00:26
wikingholycow these lawyers are idiots00:26
blackburnI really hate oracle00:26
wikingit'll be really funny now00:27
blackburneverything is so cumbersome00:27
blackburnand a lot of lawyers00:27
wikingseeing all kinds of cases when they start to sue each other by using the same api00:27
wikingso i just wonder00:27
blackburnwiking: what is API they are so proud of?00:27
wikingif i can copyright00:27
wikingprint (string a)00:28
blackburnyeah that's crazy expertise-requiring pattern00:28
wikingbecause as soon as i can acquire the copyright for thaaaat!!!00:28
wikingi'm gonna sue each and every motherfucker trying to print :D00:28
blackburnwiking: I believe it took a few years to come up with this api00:28
wikingblackburn: :>>>00:29
wikingi wonder if this would be the same00:29
wikingprint (string a)00:29
wikingprint (char* a)00:29
-!- karlnapf [] has quit [Quit: Leaving.]00:29
blackburnno char* goes for me00:29
blackburnI need money too :D00:29
blackburntotally different API00:29
blackburnwiking: you can take all the print API market00:30
blackburnsmall companies need their own print API00:30
PhilTilletI take the operator= market00:30
wikingi think after a while technological companies will have to move out of the USA00:30
wikingi mean like this there's not much thing u can actually use00:31
wikingwithout license00:31
blackburnhmm my brain is low-power00:33
wikingneed store_model_features nooow :D00:37
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]00:50
cronorcan somebody help me with combined kernels for MKL? If i append one kernel it works fine, if i add the same kernel again, it doesn't work anymore.00:52
cronorthis is my code:
cronorminimal working code00:55
-!- av3ngr [av3ngr@nat/redhat/x-ggqpgdgimwhbvflr] has joined #shogun00:57
-!- blackburn [~qdrgsm@] has left #shogun []01:13
-!- cronor [] has quit [Read error: Connection reset by peer]02:11
-!- vikram360 [~vikram360@] has joined #shogun02:47
-!- PhilTillet [] has quit [Remote host closed the connection]02:49
-!- vikram360 [~vikram360@] has quit [Ping timeout: 245 seconds]04:07
-!- puffin444 [62e3926e@gateway/web/freenode/ip.] has joined #shogun05:31
-!- wiking [] has joined #shogun06:18
-!- wiking [] has quit [Changing host]06:18
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun06:18
puffin444Hi wiking06:33
-!- gsomix [~gsomix@] has joined #shogun06:55
-!- gsomix [~gsomix@] has quit [Client Quit]06:57
-!- gsomix [~gsomix@] has joined #shogun06:57
-!- vikram360 [~vikram360@] has joined #shogun07:07
-!- sonney2k_ [] has joined #shogun07:12
-!- abn_ [av3ngr@nat/redhat/session] has joined #shogun07:14
-!- Netsplit *.net <-> *.split quits: @sonney2k, naywhayare, av3ngr, puffin44407:21
-!- Netsplit over, joins: naywhayare07:22
-!- abn_ [av3ngr@nat/redhat/session] has quit [Changing host]07:22
-!- abn_ [av3ngr@nat/redhat/x-sjdrwpxzkatjgopq] has joined #shogun07:22
-!- puffin444 [62e3926e@gateway/web/freenode/ip.] has joined #shogun07:24
-!- puffin444 [62e3926e@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]07:28
sonne|workwiking: the rationale is: when you return a newly created SGObject but don't need it internally you should *NOT* SG_REF it!08:42
* sonne|work sigh08:42
sonne|workthe biggest consumer of SGMatrix is SimpleFeatures08:42
sonne|workand that one doesn't internally use it yet08:43
sonne|workso it needs a BIG effort to convert it to use these things08:43
-!- cronor [] has joined #shogun08:57
wikingsonne|work: ok09:23
sonne|workwiking: btw shouldn't we rename CSimpleFeatures to CDenseFeatures09:24
sonne|workwe are breaking too much anyways so we can as well fix legacy naming bugs09:25
wikingthat would be more appropriate09:25
-!- n4nd0 [] has joined #shogun09:32
wikingsonne|work: ideas for the store model features in case of kernelMC?09:39
sonne|workwiking: same way as in KernelMachine09:39
wikingwell yeah that's the problem09:39
sonne|workbut you should try to avoid storing them more than once09:40
wikingcurrently we don't store svs within the class09:41
sonne|workwiking: but we should...09:42
sonne|workmore work but I guess pluskid can help here too09:42
wikingi'll see about it then09:43
sonne|workit will need some discussion how to do it most efficiently09:44
sonne|workas in no / little code duplication09:44
wikingand as i can see we can suppose that any kernelMC machine will have SVs09:44
wikingjust like in binary case09:45
sonne|workand no memory overhead09:45
sonne|workyeah but there are different schemes09:45
sonne|workone vs one09:45
sonne|workone vs rest09:45
sonne|workand general ECOC09:45
wikingmaybe then09:46
sonne|workso we need a way to store only the required SVs - but *once*09:46
wikingimplementing the data_lock mechanism would be faster? :)09:46
sonne|workdon't see how it is related09:46
wikingwell i want to have xval for kernelMC09:47
wikingthat's it09:47
wikingit's either via implementing data_lock et al or doing the store_model_features09:47
sonne|workhehe it is clear that we need both09:48
wikingheheh in a long run yeah09:51
wikingbut i just want to have xval support for MC kernel machines kind of like now09:51
-!- cronor [] has quit [Quit: cronor]10:02
sonne|workwiking: I guess you have only few examples so precomputing is no  problem10:22
-!- n4nd0 [] has quit [Ping timeout: 272 seconds]10:28
wikingsonne|work: you mean by splitting the dataset by hand? :)10:35
sonne|workhow big is it?10:35
wiking200k examples10:36
wikingwith 20+ labels10:36
sonne|workthen data lock won't help you10:37
sonne|workIIRC that will precompute the kernel matrix10:37
sonne|work-> boom boom10:37
sonne|workbut I would rather use a linear method for that...10:38
-!- abn_ [av3ngr@nat/redhat/x-sjdrwpxzkatjgopq] has quit [Quit: That's all folks!]10:42
wikingbtw i was thinking to extend CrossValidationResult for MC case10:43
wikingso that we'll be able to print as well not just the mean accuracy10:43
wikingbut lets say mean accuracy per class10:43
wikingany objections10:43
wikingand actually have the possibility for per class precision, recall, F1 for mc as well... that could be done today imho10:45
sonne|workwiking: feel free... maybe add some flag or so to dis/enable to. not sure exactly whether this should not be sth in evaluation10:46
sonne|workI mean like we have contingencytable evaluation10:46
wikingsonne|work: ok10:46
sonne|workwith f1/accuracty/.. for binary classification10:46
sonne|workyou could do the same for mc10:46
wikingthat's what i thought10:46
wikingfollow the same idea10:46
wikingfor MC10:46
wikingok i'm off for 2 hours for a meeting10:47
wikingand i'll be doing these after that10:47
wikingand help u out as well in the leaking of matrix10:47
wikingand i guess i should rather write an email about store_models_feature on the mailing list10:47
wikingsince pluskid is not around10:48
wikingi guess he is still sick or something10:48
sonne|workyeah - lets hope he recovers soon10:51
wikingok email sent to the mailing list10:53
wikinglets see the reactions10:53
wikingok i'm off. bbl10:53
-!- wiking [~wiking@huwico/staff/wiking] has quit [Quit: wiking]11:04
-!- naywhayare [] has quit [Ping timeout: 240 seconds]11:09
-!- naywhayare [] has joined #shogun11:09
-!- cronor [~cronor@] has joined #shogun11:25
cronorWould somebody check my code (20 lines) for CombinedKernel? If I use one Kernel in MKL everything works fine, if I add the same kernel again, the results become really bad. I guess I use CombinedKernels wrong but can't find help in the examples or documentation. Code:
-!- blackburn [~qdrgsm@] has joined #shogun11:38
cronoryou're done with your exam?11:40
blackburncronor: you had some mkl trouble, right?11:44
cronoryes, i just posted before you came in11:47
cronorWould somebody check my code (20 lines) for CombinedKernel? If I use one Kernel in MKL everything works fine, if I add the same kernel again, the results become really bad. I guess I use CombinedKernels wrong but can't find help in the examples or documentation. Code:
blackburnsonne|work: densefeatures11:54
cronorblackburn: do I have to use CombinedFeatures, too?11:54
blackburncronor: ok looks ok..11:56
blackburnI am not sure why you use custom kernels11:57
blackburncronor: but why do you attach *same* kernels?11:58
cronorblackburn: just for testing. if it works with one kernel and i attach the same kernel again, the results should be exactly the same. but they are not11:58
cronorso i know there is something wrong with the code and not with features11:59
blackburncronor: it seems to be a little confusing for me12:02
blackburnwhat should be weights then?12:02
cronorthe weights should be 0.7 and 0.7 for 2-norm. this works, shogun gets this12:03
blackburncronor: so weight are correct?12:05
cronorblackburn: yes weight's are correct. but i don't think this is relevant. the results for using 2 kernels should not be worse than for using 1 kernel12:06
blackburnbut apply is wrong?12:07
blackburncronor: what do you compare, accuracy?12:08
cronorblackburn: relative absolute error12:08
cronorthe error with one kernel is 0.12, with two kernels 0.8, there must be something wrong12:09
blackburnah regression12:12
cronorno i take abs(truth-pred)/max(abs(truth), abs(pred))12:13
cronorbut mse has the same problem12:14
blackburncronor: what if you manually set weights to 1.0, 0.0?12:17
cronorhow do you set manually?12:17
cronori can't find a function for that12:20
blackburncronor: set_subkernel_weights in kernel12:23
cronorblackburn: ah were do i set that? after mkl.train()?12:24
blackburncronor: yeah you can try it before apply just to check whether with two kernels but one zero-weighted you have the same result12:24
cronorblackburn: ok, i'm a little confused as to how the test kernel would get to know about the weights for example. maybe there is more wrong, can you check the added lines?
blackburncronor: btw are your kernels proper normalized?12:28
cronori use the same kernel twice, so i don't need to worry about that here, right? in general i do normalize12:29
cronorblackburn: i have to leave for lunch, but i'll be back in 30 min. then i'll try the manual setting of sub kernel weights12:30
-!- cronor [~cronor@] has quit [Quit: cronor]12:30
blackburncronor: yes it should disable second kernel, does it make sense?12:30
-!- blackburn [~qdrgsm@] has quit [Ping timeout: 244 seconds]12:35
-!- blackburn [~qdrgsm@] has joined #shogun12:36
-!- blackburn [~qdrgsm@] has quit [Ping timeout: 244 seconds]12:50
-!- gsomix [~gsomix@] has quit [Quit: Ex-Chat]12:50
-!- gsomix [~gsomix@] has joined #shogun12:50
-!- blackburn [~qdrgsm@] has joined #shogun12:52
-!- blackburn [~qdrgsm@] has quit [Ping timeout: 244 seconds]12:59
-!- blackburn [~qdrgsm@] has joined #shogun13:01
-!- blackburn [~qdrgsm@] has quit [Ping timeout: 244 seconds]13:10
-!- blackburn [~qdrgsm@] has joined #shogun13:10
sonne|workblackburn: you like the name DenseFeatures too?13:27
blackburnsonne|work: sure13:27
blackburnit is dense why simple?13:27
sonne|workbecause it was dense :D13:28
blackburnit is dense but called simple humm13:28
blackburnsonne|work: I was pretty near to fail exam ;d13:29
blackburnwell not fail but bad grade13:29
gsomixblackburn, 3?13:30
blackburnno, got 413:30
blackburnI don't care at all - but no 3 grades allowed :D13:31
-!- cronor [] has joined #shogun13:32
cronorblackburn: when i set the kernel weight in K_tr, does it affect the kernel weight in K_te?13:33
blackburnhow do they relate?13:34
cronorI do mkl.set_kernel(K_tr), mkl.train(), mkl.set_kernel(K_te), mkl.apply().get_labels(). I can imagine that the kernel weights that were learned for combined kernel K_tr do not affect the kernel weights for K_te13:35
blackburnyeah makes no sense to train one kernel and set another one later13:36
cronorso how could i do this? use test and train kernel in MKL?13:36
sonne|workcronor: well you can check this - just get the kernel weight and compare if it is differnt13:38
cronorsonne|work: yes, i'll try13:38
-!- vikram360 [~vikram360@] has quit [Ping timeout: 260 seconds]13:39
-!- vikram360 [~vikram360@] has joined #shogun13:39
-!- wiking [] has joined #shogun13:41
-!- wiking [] has quit [Changing host]13:41
-!- wiking [~wiking@huwico/staff/wiking] has joined #shogun13:41
wikingyooo back!13:42
-!- vikram360 [~vikram360@] has quit [Ping timeout: 260 seconds]13:49
-!- vikram360 [~vikram360@] has joined #shogun13:50
-!- pluskid [~pluskid@] has joined #shogun13:50
wikingpluskid: hey how are you? any better?13:52
pluskidwiking: hmm, better, maybe :)13:52
pluskidwiking: reading the email13:52
wikingpluskid: yeah we need you :)13:53
pluskidwhy do we need to store the features in store_model_features?13:53
wikingso take your time, get better first!13:53
wikingpluskid: xval13:53
wikingw/o that function we cannot do cross validation now13:53
wikingon MC kernel machines13:54
pluskidI mean, why not store index into the big kernel matrix?13:54
pluskidwe have the kernel for all data, don't we?13:54
wikinghonestly this is not really my 'genre', so i guess you should rather talk about this with heiko or sonne|work13:54
sonne|workpluskid: let me explain13:54
-!- vikram360 [~vikram360@] has quit [Ping timeout: 260 seconds]13:55
wikingi just want xval support :)13:55
sonne|workfor kernel machines we have the vectors and their coefficients13:55
sonne|workso that is what in the end makes up the machine13:55
sonne|workso if you did a training you in the end need both for a functioning machine13:55
sonne|workwe started with having just the indices into training data13:56
pluskidso why can't we do this in cross validation?13:56
sonne|workbut that is no good in most common cases13:56
sonne|workfor multiclass the picture is different13:56
sonne|workit is much more efficient to just store the relevant support vectors once and indices into them13:57
pluskidI see13:58
sonne|workor keep the training set and the index if there are too many13:58
pluskidI mean, if we forget about cross-validation for a moment13:58
sonne|workhowever currently we have several SVMs13:58
sonne|workwhich are the result of a multiclass training13:59
pluskidfor multiclass kernel machine13:59
sonne|workand all of them store the features13:59
sonne|workand that is not efficient13:59
pluskidsonne|work: so we need to make it efficient13:59
pluskidisn't it?13:59
sonne|workso we need a general scheme to only store the indices and the training data as union of all SVs (or as a whole)13:59
sonne|workpluskid: well it is not implemented because of this14:00
pluskidso this is not (only) relevant to cross validation, am I correct?14:00
sonne|workand this should work with general ecoc schemes then ...14:00
pluskidOK, I see14:00
sonne|workin x-val it is more important though because you modify training data all the time14:00
sonne|work(by setting different subsets)14:00
pluskidwe are using a single copy of features, and subset to index into them in MC training14:02
pluskidcan we do this for support vectors?14:02
pluskidI mean only store the SV as subset into the training data, instead of making a copy14:03
sonne|workpluskid: no because when the subset is changed the indices become invalid14:05
sonne|workwe would need support for multiple subsets (aka views) for data14:05
pluskidI guess I saw something like subset_stack somewhere?14:06
pluskidsonne|work: why subset changes? do you re-train the model?14:07
sonne|workpluskid: yeah but that is just one subset consisting of a couple of stacked subsets14:07
pluskidsonne|work: oh, I see, we need two subset, one for features, one for SVs, right?14:09
pluskidcurrently they can't exist at the same time14:09
sonne|workone for cross validation subset14:10
sonne|workon top of it we have just some int32_t* into the subset14:10
pluskidwe need something like feature_views ?14:10
sonne|workpluskid: or we just store all SVs in the trained mc machine as copy14:12
sonne|workand all good14:12
blackburnsonne|work: as is?14:12
blackburnor indices?14:12
sonne|workas is14:12
blackburnsonne|work: 100000 svs?14:12
sonne|workplus indices for each ecoc scheme14:12
sonne|workblackburn: that is the way we do it for binary now14:13
sonne|workif data gets bigger no one will use x-val anyways14:13
blackburnhmm right14:13
blackburnbut still can be more efficient14:13
sonne|workyeah but in the end the trained model is: all SVs, + their coefficient14:14
blackburnsonne|work: do we need to unref matrix members?14:15
blackburnsonne|work: in case we have m_matrix in class14:15
blackburndo we need to unref it on destruction of class?14:15
sonne|worksame for sgvector14:15
blackburnsonne|work: so dense features are b0rken right now?14:16
blackburneverything fails..14:16
sonne|workblackburn: we need to convert simple features to use SGMatrix internally14:17
sonne|workotherwise everything fails14:17
blackburnI can try14:17
blackburnif you didnot start yet14:17
blackburnhowever I got almost no sleep tonight - can be powered off anytime :D14:18
sonne|workI know what you mean...14:18
pluskidI'll look at some of current code first, not quite familiar with our current way14:19
blackburnI invented sleep study method :D read lecture notes - sleep - read lecture notes - sleep14:20
sonne|workblackburn: or do some simple thing first please and remove SG_VUNREF and replace it with vec.unref()14:23
blackburnsonne|work: ok14:24
sonne|workI think this only appears in simple features and at some point we don't really need it :)14:24
blackburnsonne|work: I should try to help as much as I can next days because it really breaks workflow14:25
-!- emrecelikten [~Anubis@] has joined #shogun14:25
sonne|workblackburn: yeah we also need SGSparseMatrix conversion, Stringlist etc14:25
blackburnhmm vunref appears once14:26
blackburnsonne|work: strunkwhat?14:26
blackburnis it german word for that?14:26
blackburnsonne|work: SGStroka ;)14:26
sonne|workSGStroika :)14:26
emreceliktenhi all14:27
blackburnsonne|work: ah btw14:28
blackburnin kernels I believe we need to change compute14:28
blackburnI think to reduce code it would be nice to redefine method that works in vectors context already14:28
pluskidsonne|work> pluskid: or we just store all SVs in the trained mc machine as copy14:28
pluskid^^^^^^ this is our current impl, right?14:28
pluskidfor both binary and mc14:29
blackburnpluskid: yes for binary14:29
sonne|workpluskid: no we don't have anything for mc14:29
blackburnfor mc notimplemented you know14:29
blackburnfor linear mc you don't need to store any model in means of vectors14:29
-!- karlnapf [] has joined #shogun14:29
pluskidfor kernel mc, store for each binary machine14:30
sonne|workno better store all required SVs once14:30
sonne|workand in the binary submachines just the indices14:30
blackburnmakes sense14:31
blackburnbut pretty complex thing14:31
pluskidmight have to store an extra copy of Features14:31
blackburnone would need to collect all indexes of SVs14:32
sonne|workrequires an extra copy14:32
blackburncopy these vectors from lhs features14:32
blackburnand setup indices in each machine14:32
gsomixkarlnapf, hey.14:32
karlnapfhey gsomix14:32
sonne|workyou could use gsomix' CSet to obtain the features14:33
sonne|workthe indices14:33
sonne|workand then just create the copy14:33
blackburnno cset anymore? :D14:33
blackburnit is map :D14:33
sonne|worknah it will come back14:33
pluskidmaybe need to make a whole copy, because we don't know which features are needed at first14:34
blackburnto obtain features std's set is ok ;)14:34
gsomixsonne|work, it will come back | ok.14:34
blackburnsonne|work: to make you feel more comfortable14:34
-!- vikram360 [~vikram360@] has joined #shogun14:34
blackburnI'll #define CSet std::set14:34
sonne|workpluskid: ?14:35
sonne|workpluskid: you know after training which vectors are needed14:35
gsomixkarlnapf, what about trace? it works?14:35
sonne|workyou have the indices14:35
karlnapfgsomix, trace?14:35
pluskidsonne|work: need to do union for all cross-valid trains14:35
sonne|workpluskid: no14:35
sonne|workper machine14:35
pluskidper mc-machine?14:36
sonne|workkarlnapf: right? x-val stores model features per trained machine?14:36
karlnapfgsomix, havent checked yesterday, will try now14:36
sonne|workkarlnapf: beware of SGMatrix transition14:37
karlnapfsonne|work, yes model features per trained machine14:37
sonne|workpluskid: see ^14:37
karlnapfsonne|work, oh this will get messy :) Not all memory leaks are fixed yet, the SGVector transition kind of fucked up the migration parameter stuff, I'll have to get into it again :(14:37
sonne|workkarlnapf: would it be possible to drop the trained machine?14:37
karlnapfgsomix, yes trace -mallocs works14:38
karlnapfat least there are less errors14:38
karlnapfbut still lots of14:38
blackburnkarlnapf: can you please formulate algorithm requiring store model features?14:38
blackburnstep by step :D14:38
blackburnto clarify that14:38
karlnapfyes sure, give me a second14:39
sonne|workkarlnapf: I mean then you could determine optimal parameters14:39
sonne|workbut don't have the machine afterwards14:39
sonne|workbut you could get it by training again14:39
karlnapfehm, you lost me ;)14:39
-!- vikram360 [~vikram360@] has quit [Ping timeout: 240 seconds]14:39
sonne|workwould be useful if you don't have enough memory to store machines for 1000 parameter settings14:39
karlnapfgsomix, seems like trace-mallocs gives the same number of errors as without it :) nice14:40
gsomixkarlnapf, which tests?14:41
karlnapfsonne|work, blackburn could you summarize the problem once more for me?14:41
sonne|workkarlnapf: there is no problem14:42
sonne|workat least from my side14:42
sonne|workwe just need store model features for mc14:42
karlnapfand store_model_features basically has to make sure that the data that represents the model is stored internally, so kernel machine stores all features corresponding to SV in the LHS of the kernel14:42
karlnapfah ok14:42
sonne|workand to do that efficiently we need to store all SVs once14:42
sonne|workand indices for each sub machine14:42
karlnapfwhat about the way I suggested in the mail this morning?14:42
sonne|workI am just asking if one could avoid storing the machine at all14:43
karlnapfyes fo course, but then x-val vanishes14:43
sonne|workkarlnapf: isn't your mail the same I suggested14:43
sonne|workexcept that it doesn't use subsets?14:43
sonne|workkarlnapf: why?14:43
sonne|workx-val could compute output score and good14:44
karlnapfyes, might be14:44
sonne|workso you know the best parameter setting14:44
karlnapfx-val needs to use a trained model to apply14:44
karlnapfwhich is done by subsetted features14:44
karlnapfwhen you then switch to another subset and apply this doesnt work14:45
karlnapfunless you store the data14:45
sonne|workkarlnapf: ahh you mean because of train/test data split?14:45
sonne|workbut do you drop the machine afterwards?14:45
karlnapfyou can only apply on training data if store_model_features is not implemented14:46
sonne|workI mean SG_UNREF ?14:46
karlnapfafter training?14:46
sonne|worktraining -> evalution -> SGUNREF ?14:46
karlnapfmmh, not sure, let me check14:46
blackburnsimple -> dense14:46
sonne|workblackburn: ?14:47
blackburnsonne|work: doing ;)14:47
sonne|workblackburn: did you remove the VUNREF first?14:47
sonne|work(and commit?)14:47
blackburnsonne|work: yes I removed the only VUNREF14:47
sonne|workand the macro too I hope14:47
karlnapfsonne|work, the machine is stored in the X-Val class14:47
karlnapfso not, its not dropped14:47
karlnapfonly if you drop x-val class by hand14:48
sonne|workkarlnapf: so what I am saying is that we need an option14:48
sonne|workto automagically drop it14:48
sonne|workthings might get too big otherwise14:48
karlnapfI dont get this14:48
sonne|workkarlnapf: think of training on 10k examples with several classes 1000 times :)14:48
blackburnoops I was editing shogun on different machine :D14:49
sonne|work(1000 parameter combinations)14:49
sonne|workso we don't want to store the 3000 SVs 1000 times14:49
karlnapfbut only the current SV set is stored14:49
sonne|workkarlnapf: so only a single machine is kept?14:50
karlnapfI am not sure if I get the problem though14:50
sonne|worknot one for each parameter setting14:50
karlnapfah wait14:50
karlnapfso you mean you have multiclass with 1000 machines14:50
karlnapfand each stores the SVs14:50
karlnapfand we dont want that14:50
sonne|worklets talk binary class for simplicity14:51
* pluskid starts to understand what to be done after scanning some code of cmachine and crossvalidation14:51
sonne|workyes we cannot store them all because they don't fit in memory14:51
pluskidsonne|work: I think only "current" and "best" machines are stored along model selection14:52
karlnapfbut if we cannot store the SVs in memory, we were not able to store data in memory before right?14:52
sonne|workkarlnapf: is it true what pluskid says - because that is what I am asking14:52
CIA-113shogun: Sergey Lisitsyn master * r3abc1ea / (2 files in 2 dirs): Removed vunref -
sonne|workif yes then all good14:53
karlnapfare we talking about model-selection or cross-validation?14:53
blackburnI am completely fucking lost with it14:53
sonne|workmodel selection14:53
karlnapfah ok, sorry then14:53
sonne|workwith grid search14:53
karlnapfthere is only one machine14:53
karlnapfparameters are applied to machine14:54
sonne|workok then all good14:54
karlnapfresults are stored14:54
karlnapfnext parameter combination14:54
karlnapfsee CGridSearchModelSelection::select_model14:54
karlnapfbtw store_model_features is related to x-val14:55
karlnapfnot to model selection14:55
sonne|workpluskid: ok but then I would say we collect all indices, store the needed vectors as copy in memory and in submachines the adjusted indices14:55
sonne|workand then all good14:55
karlnapfsonne|work, gsomix, yes, I think thats the best way :)14:56
pluskidsonne|work: ok, I'll do this now14:56
wikingwe have an agreement? :)14:58
pluskidwiking: as heiko's solution in the mailing list14:59
sonne|worklooks like :)14:59
wikingpluskid: r u implementing it now?14:59
sonne|workpluskid: but w/o subsets right?14:59
pluskidwiking: yes14:59
blackburnthe most painful thing to come in a hour!14:59
sonne|workI mean we have individual kernel machines anyways14:59
wikingpluskid: awesome!!!14:59
pluskidsonne|work: no subset, an extra copy of features14:59
sonne|workso we can use their set_support_vector stuff15:00
blackburnprepare to change all your C++ shogun scripts :D15:00
sonne|workpluskid: err but only in MulticlassKErnelMachien15:00
pluskidsonne|work: yes, he will then adjust index for sub-machines15:00
sonne|workblackburn: DenseFeatures are coming?15:00
blackburnsonne|work: can I remove align_char_features?15:01
wikingpluskid: i've already prepared a libshogun example for MC kernel machine xval15:01
sonne|workblackburn: is this used anywhere?15:01
wikingso i'm waiting for your commit :)15:01
sonne|workwiking: is the latent svm ready?15:01
blackburnsonne|work: no it is uflly ommcented15:01
wikingsonne|work: oh yeah !15:01
wikingsonne|work: it works :>15:02
wikingsonne|work: but only for this one and only example :)))15:02
wikingand i need to reflect on your email about PSI15:02
sonne|worklike the xval for mc here15:02
wikingsince i have a request there to do15:02
pluskidwiking: haha, cool!15:03
blackburnsonne|work: so ok to remove?15:04
blackburnsimple preprocessor goes to dense too15:04
wikingsonne|work: now it's time for some extra evaluation statistics for MC15:08
sonne|worksplattered all over15:08
blackburn'simple' word holocaust15:10
-!- emrecelikten is now known as emre-away15:10
blackburngsomix: when will you be ready for some bloody task?15:11
sonne|workblackburn: I think you should stick with stalin15:14
blackburnyeah he is bloody enough to support me here15:15
sonne|workthe battle of stalingrad...15:17
sonne|workno but actually that would be when I start to like std:: crap15:17
blackburnsonne|work: today could be a great day to defeat and start to like std ;)15:23
wikingmmm shit i cannot set a topic ;P15:27
wikingsonne|work: can u just append to the topic: | some of us hates std namespace here so don't even try ;)15:28
blackburnany sed expert here?15:31
wikingblackburn: what do u need?15:31
blackburnqdrgsm@qdrgsm-laptop:~/Shogun/shogun/src$ find shogun/converter | sed 's/<shogun/features/SimpleFeatures.h>/<shogun/features/DenseFeatures.h>/g'15:31
blackburnsed: -e expression #1, char 20: unknown option to `s'15:31
blackburnwiking: ^ what is wrong?15:32
wikingi would say15:32
wikingu need to escape <  and >15:32
wikingand /15:32
wikingr | sed 's/\<shogun\/features\/SimpleFeatures.h\>/\<shogun\/features\/DenseFeatures.h\>/g'15:32
blackburnwiking: hmm how to replace in each file?15:34
wikingwell first of all use -i15:34
sonne|workblackburn: find ./ -type f -exec sed -i ... {} \;15:35
wikingso that u do inplace replacement15:35
wikingand yeah that should be fine what sonne|work says15:35
blackburnwhat does he say?15:35
wikingthat do exec with find15:35
blackburnI did not mention it was sonne|work15:36
blackburnok my brain is damaged15:36
wikingbut u could do first a grep15:36
blackburnyeah I did15:36
wikingso that run the sed command only on the necessary files ;)15:36
blackburnsonne|work: thanks15:36
wikingbut that makes it a little more complicated so just go with applying that on every file15:37
wikingblackburn: but why don't u simply do sed -i -e 's/SimpleFeatures/DenseFeatures/g' ?15:38
blackburnyes I did it15:38
blackburnwanted to start with includes for some reason :D15:38
wikingwell this way u do the replacement everywhere not only with includes... ;P15:39
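Putting the pieces of this exchange together, the whole rename can be done in one pass: plain identifier replacement needs no escaping at all — blackburn's original error came from the unescaped `/` characters in his path pattern colliding with the `s///` delimiter — and `find -exec` with `sed -i` covers every file. A sketch on a throwaway tree, assuming GNU sed (the `-i` form differs on BSD/macOS):

```shell
# Build a throwaway copy to rename, mimicking a Shogun source file.
mkdir -p /tmp/rename_demo/features
cat > /tmp/rename_demo/features/test.cpp <<'EOF'
#include <shogun/features/SimpleFeatures.h>
CSimpleFeatures<float64_t>* f;
EOF

# The find/sed combination suggested above: -i edits in place, and the
# bare identifier pattern catches includes and class names alike.
find /tmp/rename_demo -type f -exec sed -i 's/SimpleFeatures/DenseFeatures/g' {} \;

grep -c DenseFeatures /tmp/rename_demo/features/test.cpp   # both lines renamed
```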
cronorwhat SVR does MKLRegression use, LibSVR or SVRlight?15:42
cronorblackburn: thanks15:42
pluskidI hate SG_UNREF as much as sonne|work hate stl15:54
pluskidcompiled but not tested15:55
wikingpluskid: checking15:56
wikingpluskid: went with std::set15:56
pluskiddidn't find CSet15:56
wikingheheh it has become CMap ;P15:57
wikingpluskid: commented16:01
pluskidwiking: got it16:02
wikingand added about includes as well16:04
wikingi don't see u using std::vector anywhere16:04
wikingbut yeah i see u using map16:04
pluskidwiking: was using vector, but later find copy_subset requires a SGVector, so replaced16:05
pluskidI'll fix the include16:05
wikingno worries i'm just saying16:06
wikingsonne|work: i think this goes against the policy about SG_REF that u were mentioning in the morning: inline CFeatures* get_rhs() { SG_REF(rhs); return rhs; }16:07
wikingor not...16:07
wikingsonne|work: btw: what has been the outcome of the conversation with vojtech about libqp?16:08
wikingpluskid: anyhow i'll download the patch and test it here locally16:09
pluskidwiking: tell me sth about SG_REF policy, I'm always confusing at this. Each time I have to open the source of the function I'm calling to see whether it did SG_REF or not, to determine whether I have to do SG_UNREF or not16:09
wikinghow can i download this commit as a raw file?16:10
pluskidwiking: maybe you can create a local branch and merge this16:10
pluskidthis might be easier16:10
pluskidgit remote add ...; git pull ...16:11
wikingpluskid: can u do this for me16:11
wikinggit format-patch -316:11
wikingand send the three 000* files to me16:11
pluskidsend you an email?16:11
wikingthnx a lot16:11
wikinggot it i'll do the tests now16:15
blackburnanybody aware of tool for resolving includes?16:17
blackburnunnecessary ones for example16:17
pluskidblackburn: my friend wrote a script for vim: , though I didn't test it16:18
wikingpluskid: hehehe i've got a segfault16:18
-!- karlnapf [] has left #shogun []16:19
pluskidwiking: can you send me your test code for me to debug?16:19
wikingyep just doing it16:19
wikingpluskid: this is from gdb:
wikingand i'm just sending u now the code16:19
wikingpluskid: sent16:20
pluskidwiking: thanks16:20
pluskidseems sub-machine doesn't have a kernel? ...16:21
wikingyeah that's what it is saying16:21
-!- vikram360 [~vikram360@] has joined #shogun16:38
gsomixblackburn, a?16:41
blackburngsomix: bloody task?16:41
pluskidwiking: when I call kernel->init(lhs, rhs), lhs and rhs are SG_REF-ed, so I don't have to SG_REF them manually right?16:41
blackburnpluskid: right16:41
wikingpluskid: imho yes you shouldn't16:42
gsomixblackburn, not now, but you may talk about it.16:42
sonne|workpluskid: yes16:42
pluskidwiking: I think I fixed this16:42
wikingsend me the patch16:42
sonne|workpluskid: only ever call SG_REF when you later want to use the object16:42
pluskidthough it is still crashing, but everything is crashing now ...16:42
pluskidcrashing somewhere else16:43
sonne|workblackburn: done with DenseMatrix?16:43
sonne|workblackburn: please commit...16:43
blackburnsonne|work: almost..16:43
pluskidsonne|work: I see16:43
sonne|workpluskid: we are in SGMatrix transition... in case you use latest git everything crashes16:43
pluskidsonne|work: yes, I'm using latest git16:44
pluskidso survived SGVector hell and enters SGMatrix hell? Haha16:44
sonne|workpluskid: then don't use simplefeatures ...16:44
wikingpluskid: oh i see ok i'll just apply these changes manually :)16:44
sonne|workpluskid: and then the remaining SGString/StringList/SparseVector/SparseMatrix16:44
pluskidwiking: I can send you patch, a minute16:45
wikingthat's alright i see it in the pull request :)16:45
pluskidwiking: ah, OK16:45
pluskidsonne|work: and then SGObject :D :D :D16:46
pluskidwiking: if I do SG_REF(sv_features) manually, a crash can be avoided, but yet another crash will occur16:47
pluskidso I guess it is related to SGMatrix16:48
blackburnsonne|work: almost done16:52
blackburnit seems to be compileable at least16:52
blackburngsomix: pluskid: sonne|work: wiking: recall we planned doc weeks?16:54
pluskidblackburn: yeah16:55
blackburnI wish to create a list of responsible classes to doc16:55
blackburnfor each of you16:55
sonney2k_blackburn, done?17:02
sonney2k_blackburn, I need to do massive changes to simplefeatures...17:03
blackburnsonney2k_: some issue, 5 mins more17:03
sonney2k_don't have 5 min17:03
blackburnsonney2k_: you always can merge it17:03
blackburninternals can be easy mergred17:04
sonney2k_no way17:04
sonney2k_you can always merge it17:04
blackburnokay I can17:04
pluskidsonney2k_ still on the train? :p17:06
sonney2k_pluskid, yes17:06
pluskidtrain is slow today17:06
pluskidtime to sleep here17:06
pluskidgood night guys17:06
-!- pluskid [~pluskid@] has quit [Quit: Leaving]17:06
blackburnsonney2k_: how much time you would need? should I wait with this commit for now?17:07
blackburnit is not ready though17:07
wikingmaaatrix is crashing!!!17:07
blackburna lot of includes17:07
sonney2k_blackburn, yeah commit now17:07
wikingdejavu everywhere :D17:07
sonney2k_it is broken everywhere anyways...17:08
CIA-113shogun: Soeren Sonnenburg master * r9e4616e / src/shogun/machine/MulticlassMachine.cpp : minor whitespace changes -
CIA-113shogun: Soeren Sonnenburg master * rcf5e351 / src/shogun/multiclass/ConjugateIndex.cpp : fix crasher in conjugate index -
CIA-113shogun: Soeren Sonnenburg master * r7bb62a6 / (3 files): don't call destroy_matrix() - SGMatrix cleans up itself -
blackburnsonney2k_: do you have anything more to commit?17:23
CIA-113shogun: Sergey Lisitsyn master * rb4ef345 / (234 files in 17 dirs): Simple to Dense renaming -
CIA-113shogun: Sergey Lisitsyn master * r81a30d8 / examples/undocumented/libshogun/regression_gaussian_process.cpp : Restored gp example -
wikingsonne|work: where can i define DEBUG_SGVECTOR ?17:38
wikingor how do u turn it own...?17:38
blackburnwiking: config.h17:39
wikingsonne|work: so what's the idea with the referenced object, should i call SG_UNREF on it or we assume that it automagical?18:00
wikingsince now i have a double free on an SGMatrix...18:00
blackburnwiking: what do you mean?18:00
wikingso the thing is that i have malloc: *** error for object 0x10207c800: pointer being freed was not allocated18:01
wikingso for sure i have a double free18:01
wikingand it's with an SGMatrix18:01
blackburnwiking: do you use simplefeatures?18:01
blackburnit will fail18:01
blackburnlet me try to fix it18:02
wikingso the thing here is18:02
blackburnwiking: dense features should use sgmatrix internally18:02
blackburnso after get feature matrix is called18:02
wikingso if i comment out18:02
wikingthe SG_UNREF for the simple features18:03
wikingthen the whole thing runs ok18:03
wikingso this is why i think that the unreferencing/deleting is done automagically via the destructor18:03
wikingso i don't need to explicitly call sg_unref on the features18:04
blackburnwiking: no it is related to matrix18:04
wikingso i'm creating the simplefeautres like this: CSimpleFeatures< float64_t >* features = new CSimpleFeatures< float64_t >(mat.matrix, num_feats, num_vectors);18:05
wikingand the mat is an SGMatrix that has been allocated in the same function call before18:05
wikingso that's why i think i should not call an SG_UNREF (features)18:05
blackburnwiking: you should after this issue is fixed18:06
wikingi should??/18:06
wikinglet me grab the last commits18:06
blackburnwiking: no it is not fixed yet18:07
wikingok anyhow i want to see what's happening18:07
blackburnwiking: the problem is that after gotten feature matrix18:07
blackburnis destroyed18:07
blackburnit destroys matrix in features18:07
blackburnand features destroy it once again18:07
blackburnshiiiit 1K LoC18:10
* wiking porting the code to densefeatures18:11
blackburnwiking: I wanted to fix that issue in densefeatures..18:15
wikingblackburn: CStreamingSimpleFeatures -> CStreamingDenseFeatures18:15
wikingok works18:16
wikingnow i have to see what's the problem with18:16
wikingpluskid's patches18:16
-!- puffin444 [62e3926e@gateway/web/freenode/ip.] has joined #shogun18:46
sonney2k_wiking, blackburn touch Simple^H^H^HDenseFeatures and die!18:47
sonney2k_I am working on them18:47
puffin444How's everyone doing?18:48
cronorHow can i find out which indexes are support vectors?18:49
sonney2k_cronor, get_support_vectors()18:50
sonney2k_puffin444, peace, rock&roll and love - we have it all18:51
puffin444That sounds great! I'm sorry I haven't been in IRC for the last two weeks. I had an awful end of the semester. But now I can say that I am liberated from my scholastic duties for the time being. :)18:53
sonney2k_puffin444, very good - we are in the middle of a couple of transitions but hopefully we are back to normal in 1-2 weeks18:56
puffin444By the way I was invited to a GSoC meeting at Google Chicago18:57
puffin444Students are giving talks about their project. Would you mind if I gave one about my Project in Shogun?19:00
-!- vikram360 [~vikram360@] has quit [Ping timeout: 255 seconds]19:02
blackburnpuffin444: I would even help you with presentation19:04
blackburnif you want19:05
sonney2k_puffin444, and I still have slides about a shogun talk I gave...19:05
sonney2k_but I guess you want to present your project...19:05
puffin444It's not until May 23rd, so I have a good amount of time to prepare.19:07
cronorSo, if i append only one Kernel to a CombinedKernel and do MKLRegression i get worse results and totally different alphas than using this kernel in SVRLight. Is it safe to assume that there is either an error in my code or in shogun?19:11
sonney2k_cronor, and if you use svmlight + combined kernel?19:28
cronorsonney2k_: same result19:29
sonney2k_cronor, like?19:29
cronorworse results and totally different alphas19:30
sonney2k_cronor, different parameters/19:30
cronorsonney2k_: no, same C, same epsilon19:31
wikingsonney2k_: hehehe ok19:31
cronori tried cross validating C again, but it did not improve19:31
sonney2k_blackburn, in LinearLocalTangentSpaceAlignment you call get_feature_matrix19:31
blackburnsonney2k_: yes?19:31
-!- emre-away [~Anubis@] has quit [Ping timeout: 248 seconds]19:32
sonney2k_do you keep it constant19:32
sonney2k_or modify it?19:32
blackburnsonney2k_: yes center it19:32
sonney2k_blackburn, or shall I commit and you fix things19:32
sonney2k_wait no19:32
sonney2k_you center matrix.matrix19:32
sonney2k_not feature_matrix19:32
blackburnsonney2k_: daxpies after are mean subtraction19:32
blackburnmean feature vector19:33
sonney2k_blackburn, so you need a copy19:33
blackburnyes or find a way to do it implicitly19:33
blackburnI can fix it19:33
sonney2k_ok let me commit then19:33
blackburnanyway it is all useless shit19:34
CIA-113shogun: Soeren Sonnenburg master * r56661f3 / (2 files): WIP: use SGMatrix inside of SimpleFeatures -
sonney2k_I know19:34
sonney2k_blackburn, can you please fix it *now*19:34
blackburnsonney2k_: you want to remove get_feature_matrix(&)?19:35
cronorsonney2k_: these are the two functions i use, they should output the same results (in theory)
shogun-buildbotbuild #877 of libshogun is complete: Failure [failed compile]  Build details are at  blamelist: sonne@debian.org19:37
sonney2k_blackburn, I already did19:40
blackburnsonney2k_: please do not fix gui I am working on it19:41
blackburnsonney2k_: how matrix should be destroyed?19:42
sonney2k_blackburn, not at all19:43
sonney2k_or amtrix.unref()19:43
blackburnsonney2k_: SG_FREE(feature_matrix) in topfeatures19:43
sonney2k_blackburn, please commit again19:44
CIA-113shogun: Sergey Lisitsyn master * r02efa8e / (4 files in 3 dirs): A bunch of fastfixes -
blackburnsonney2k_: kind of pair programming lol19:46
sonney2k_blackburn, free_feature_matrix()19:46
sonney2k_would be the call19:46
blackburnsonney2k_: commited19:47
blackburncould you please check?19:47
sonney2k_hmmhh a better name for the commit would have been better19:47
blackburnsonney2k_: do you care about commit msgs right now? :D19:48
sonney2k_blackburn, yes19:48
sonney2k_blackburn, please check that in GUI19:48
blackburncheck what?19:48
sonney2k_fmatrix is not SG_FREEd19:48
sonney2k_because SGMatrix(fmatrix,...) will do that19:48
sonney2k_(double free...)19:48
blackburnhow should it look like?19:48
sonney2k_blackburn, please add a clone() method to SGMatrix()19:49
CIA-113shogun: Sergey Lisitsyn master * r248d9a8 / src/shogun/ui/SGInterface.cpp : Proper matrix handling in SGInterface - now clones matrices and frees given ones -
blackburnsonney2k_: happy with msg? :D19:52
blackburnin the next one I'll write a poem for you ;)19:57
sonney2k_I am so happy19:58
sonney2k_one more summer where I can sit outside19:58
blackburnare you near to that airport?19:59
wikingblackburn: are we there yet? :)19:59
sonney2k_blackburn, I am not exactly near that airport but near to berlin's biggest lake20:01
blackburndoes it produce echo of that kind of sounds?20:01
sonney2k_and they decided just last year that over hundred planes will go over the lake and the place where I am living20:01
sonney2k_night and day20:01
blackburnI listen to ambulance cars night and day20:02
sonney2k_and actually for the last 10 years no one ever mentioned that - all plans were showing other routes20:02
wikingblackburn sonney2k_ classifier/NearestCentroid.cpp:107:45: error: too many arguments to function call, expected 1, have 320:03
blackburnwiking: cool20:03
wikingi'mjustsayin' sorry20:03
sonney2k_that's the lake20:04
sonney2k_very nice there20:04
blackburnwhat is muggel?20:04
wikingsonney2k_: you live quite outside of berlin :)20:04
sonney2k_mueggelsee :)20:04
sonney2k_the name20:04
sonney2k_no idea20:05
wikingsonney2k_: how much is the commuting time?20:05
sonney2k_wiking, still berlin :)20:05
sonney2k_20 minutes20:05
wikingsonney2k_: yeah i know berlin is huge20:05
wikingu take the 'metro' :)20:05
sonney2k_wiking, was living next to the tv-tower for many years20:05
sonney2k_so enough20:05
sonney2k_wiking, no s-bahn20:05
wikingsonney2k_: Heye s-bahn that is :)20:05
* sonney2k_ checks the water temperature of the lake20:06
wikingi liked the east part only of berlin20:06
-!- PhilTillet [~Philippe@] has joined #shogun20:06
wikinghad some funny nights while i was there :D20:06
sonney2k_15 C20:06
sonney2k_very soon I know how to spend my evenings :D20:06
wikingcan u sail on it?20:06
sonney2k_but I usually swim across it20:07
wikingthen it must be the ideal place20:07
wikingu live in berlin20:07
wikingand have a lake20:07
blackburnsonney2k_: 2 km??20:07
wikingwhere u can swim/sail20:07
wikingsonney2k_: is it as cheap as berlin :DDD20:07
sonney2k_blackburn, in total - yes about that20:07
sonney2k_prizes did increase...20:07
wikingthe rents must be higher than in the downtown20:08
blackburnthat would be crazy difficult to swim 2km for me :D20:08
wikingi mean the last time i was in berlin (2010) the rent was ridicolous20:08
sonney2k_blackburn, you know I am doing all kinds of long distance swimming20:08
sonney2k_so not for me20:08
sonney2k_I think I am developing into a whale20:08
wikingi mean compared to other cities in EU or in germany...20:09
blackburnI heard something about it once :)20:09
sonney2k_slow once on land20:09
blackburnwiking: what is life costs in belgium then? ;)20:10
wikinga lot20:10
wikingit's really insane here20:10
blackburnwiking: is phd well paid?20:10
wikingthe shittiest studio in a smaller town (not brussels) is around 500 euros/month20:10
wiking+ utilities20:10
wikingblackburn: not really...20:11
blackburnwiking: small one room flat in moscow costs ~1000$ per month :D20:11
wikingbut i wonder where it is well paid :)20:11
wikingblackburn: hahahah yeah that's moscow...20:12
wikingit's fucking insane there20:12
blackburnhere it is 300-400$20:12
wikingi've heard that it's getting more expensive than tokyo20:12
blackburnwiking: I am happy to live here almost for free :D 40$ per month20:13
puffin444That much in Moscow?20:13
blackburnpuffin444: yes, avg salary is about 2-2.5K$20:14
blackburnand rent can take 50% of it :D20:15
puffin444How does that economics of that work if rent is .5 of salary?20:15
wikingblackburn: ahahahaha 40$ :)20:15
blackburnpuffin444: no idea I never lived there and do not want to :D20:16
wikingblackburn: but for example back in serbia i know people renting a 300 sqrmeter house for 80 euros20:16
puffin444It seems that enough people want to though to drive up the rent.20:16
blackburnwiking: yeah it just depends on local insanity20:17
sonney2k_rent in berlin used to be cheap - I used to live in a 85sqm flat for 450 EUR20:18
blackburnI have 12 sqm room here :D :D20:18
sonney2k_but last year prizes went up quite a bit20:18
sonney2k_they basically doubled20:18
wikingsonney2k_: heheh that's insane. i knew people paying the double price for the same size in wien20:18
wikingand that was about 6 years ago20:19
wikingand wien is a shittier place than berlin in my opinion20:19
sonney2k_I guess someone should continue sgmatrix bugfixing20:19
puffin444Are people just coming in droves to live in these cities? I wonder whats driving up the rent.20:19
-!- cronor [] has left #shogun []20:19
wikingsonney2k_: can I TOUCH IT THEN?!20:19
wikingi thought i shouldn't20:19
wiking"sonney2k_: wiking, blackburn touch Simple^H^H^HDenseFeatures and die!"20:20
sonney2k_something wicked is going on with my connection to github again20:21
blackburnnein ich moechte nicht to die20:22
blackburnwiking: iz dat speling correkt?20:22
blackburnsonney2k_: gestorben!20:23
wikingbut i think that is died20:23
wikingfor sure past tense20:23
wikingsterben is the present tense20:24
blackburnsonney2k_: sterben!20:24
wikingbut that means to die20:24
wikingbut actually if u use moechte20:24
wikingthen it should be alright20:24
wikingsince it should be in infinitive mode20:24
wikingor wtf20:24
blackburnI will let soeren know of new words I get to know20:24
blackburnsonney2k_: moechte!20:25
wikingread some chomsky noam20:25
sonney2k_blackburn, did you do sGMatrix.clone()?20:25
wikingit'll be good for you :D20:25
sonney2k_or shall I20:25
blackburnsonney2k_: where?20:25
wikingblackburn: but remember this name when u will learn about formal languages ;)20:25
sonney2k_blackburn, that says it all20:26
sonney2k_let me do it20:26
wikingS->SaSb and such20:26
blackburnsonney2k_: I did it in sginterface already20:26
CIA-113shogun: Soeren Sonnenburg master * r70b2c60 / src/shogun/lib/SGMatrix.h : add clone() method to SGMatrix -
blackburnsonney2k_: ah we had no clone :D20:29
blackburnI didn't know tha20:29
wikinghahah lol: Jesus has been in Tesco again
sonney2k_return SGMatrix<float64_t>(res,num_vectors,cur_dim_feature_space);20:35
sonney2k_blackburn, ^20:35
sonney2k_that seems wrong20:35
sonney2k_it is rows,cols20:35
sonney2k_and dims == rows20:35
sonney2k_gsomix, any news about CSet returning?20:37
sonney2k_blackburn, ahh btw did you see wikings msg - we cannot get rid of some of the load/save stuff20:38
gsomixsonney2k_, tomorrow, at morning20:39
gsomixsonney2k_, I'm making some progress with Array#->DynamicArray.20:39
sonney2k_gsomix, ok20:39
gsomixbut my android phone have said that I should go to bed :)20:40
sonney2k_gsomix, I need to talk to google about some kind of remote patch :D20:40
blackburnsonney2k_: why you complain about random fourier to me? ;)20:40
wikingNearestCentroid !20:41
sonney2k_ALL YOUR FAULT :D20:41
blackburnsonney2k_: which msg?20:41
blackburnsonney2k_: I would rather remove it at all - too hard to support20:41
blackburnand random fourier is not cool anymore20:42
sonney2k_blackburn, you always want to remove everything20:42
sonney2k_until we are stuck at int main() {}20:42
blackburnsonney2k_: lie :)20:43
sonney2k_someone grep the logs20:43
gsomixgood night guys20:43
blackburnsonney2k_: I want to remove only stuff I can't support20:44
blackburnand you don't have time to support too20:44
wikingbazdmeg a joisten faszat20:45
wikinghath nem igaz hogy nem latod  hogy a rohadt mocskos nearestcentroid se mukodik bassza meg20:46
* wiking is on the rage20:46
blackburnwiking: hungrian obscene?20:47
wikingno not at all :D20:48
blackburnwiking: do you know russian strong words? :D20:48
wikingno just serbian :>20:48
blackburnhow it sounds like?20:49
CIA-113shogun: Soeren Sonnenburg master * rde8733d / (7 files in 5 dirs): various sgmatrix compile fixes -
wikingjebem ti mater20:50
wikingpicka ti materina20:50
blackburn?? ?? ???? ????!20:51
sonney2k_blackburn, do things compile for you too?20:51
blackburnsonney2k_: ?????? ???????20:51
wikingask me sonney2k_ !!!20:51
gsomixblackburn, ?? ????????, ?????.20:52
sonney2k_wiking, !20:52
sonney2k_yes everything crashes now!20:52
blackburnsonney2k_: ?????20:52
-!- puffin444 [62e3926e@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]20:52
sonney2k_converting simplefeatures to sgmatrix is probably more intrusive than sgvectro transition20:52
sonney2k_but hey multiclasslibsvm works20:53
sonney2k_ahh examples don't compile20:53
blackburnwhat vim?20:53
wikingsonney2k_: lib compiled20:55
sonney2k_python examles seem to run mostly too20:55
shogun-buildbotbuild #881 of libshogun is complete: Success [build successful]  Build details are at
sonney2k_blackburn, can you have a look at failing python examples20:56
sonney2k_lots of converters fail20:56
* sonney2k_ continues with libshogun examples20:56
blackburnsonney2k_: yeap20:57
-!- gsomix [~gsomix@] has quit [Ping timeout: 252 seconds]20:57
sonney2k_yes sir shogun sir!20:57
sonney2k_next time please20:57
blackburnnoo way20:57
sonney2k_resistance is futile!20:57
wikingmy example works \o/20:59
wikingnow i have to check pluskid's patches20:59
-!- blackburn1 [~qdrgsm@] has joined #shogun21:01
blackburn1la restistance!21:02
-!- blackburn [~qdrgsm@] has quit [Read error: Connection reset by peer]21:02
wikingok i don't get this21:03
wikingWTF is the concept here please somebody enlighten me21:03
sonney2k_wiking, ???21:03
wikingi use an SGMatrix for a densefeature21:04
wikingif i SG_UNREF the densefeatures21:04
wikingthen i get a double free error21:04
wikingif i don't SG_UNREF21:04
wikingthen valgrind says that there's a leak21:04
wikingwith DenseFeatures21:04
sonney2k_you should unref21:04
wikingok unref kills it21:04
wikingerror for object 0x7ffe9c07c800: pointer being freed was not allocated21:04
wikingand it's because if i call SG_UNREF on the features21:05
wikingthen it'll free the sgmatrix21:05
sonney2k_blackburn1, you missed a couple of SimpleFeatures21:05
wikingbut then on the end when the function ends the SGMatrix's21:05
wikingdstor is being called21:06
wikingand that tries to free as well21:06
blackburn1sonney2k_: hmm where?21:06
sonney2k_blackburn1, in libshogun examples ... fixing21:06
sonney2k_wiking, sounds impossible21:06
blackburn1how can that be.. I replaced Simple->Dense21:06
sonney2k_wiking, check with valgrind21:06
wiking(gdb) bt21:07
wiking#0  0x00007fff980cbce2 in __pthread_kill ()21:07
wiking#1  0x00007fff8f0ec7d2 in pthread_kill ()21:07
wiking#2  0x00007fff8f0dda7a in abort ()21:07
wiking#3  0x00007fff8f13c84c in free ()21:07
wiking#4  0x0000000100002c90 in shogun::SGMatrix<double>::free_data (this=0x7fff5fbfe528) at SGMatrix.h:14921:07
wiking#5  0x0000000100002dba in shogun::SGReferencedData::unref (this=0x7fff5fbfe528) at SGReferencedData.h:11021:07
wiking#6  0x0000000100002e36 in shogun::SGMatrix<double>::~SGMatrix (this=0x7fff5fbfe528) at SGMatrix.h:5221:07
wiking#7  0x00000001000027e5 in shogun::SGMatrix<double>::~SGMatrix (this=0x7fff5fbfe528) at SGMatrix.h:5121:07
wiking#8  0x0000000100002454 in test_cross_validation () at evaluation_cross_validation_multiclass.cpp:14621:07
wiking#9  0x000000010000252e in main (argc=1, argv=0x7fff5fbfe620) at evaluation_cross_validation_multiclass.cpp:15421:07
wikingso that's what's really happening21:07
blackburn1ok can be21:08
-!- blackburn1 is now known as blackburn21:08
blackburnI did some mess there when merging21:08
wikingunless i should call an SG_REF on the SGMatrix before passing it to DenseFeatures21:09
wikingbut i guess i shouldn't do that...21:09
sonney2k_things compile again :)21:10
CIA-113shogun: Soeren Sonnenburg master * r9c81f4e / (18 files):21:11
CIA-113shogun: fix compilation of libshogun examples21:11
CIA-113shogun:  -
wikingok so in DenseFeatures.cpp:275 ::set_feature_matrix, shouldn't there be a SG_REF(matrix) somewhere within that function?21:11
sonney2k_wiking, no why?21:12
wikingsonney2k_: well then it'll happen what i'm having now21:12
wikingyou'll have a double free21:13
sonney2k_the copy constructor/assignment operator take care of that..21:14
wikingi mean if u pass a matrix to a feature that will use the matrix21:14
wikingimho if we follow what ref counting means21:14
blackburnref is +1 on copy21:14
blackburnso set feature matrix increases ref count21:15
wikingthen we should increment the reference counter on the matrix21:15
wikingwell yeah let's say it should take care of it21:15
wikingit does not21:15
sonney2k_wiking SGMatrix a,b21:15
sonney2k_will destroy a21:15
sonney2k_and inc refcoutn of b21:15
sonney2k_same with sgvector21:15
wikingmoreover if i use new CDenseFeatures< float64_t >(mat);21:15
wikingmat is an SGMatrix21:16
wikingthe whole thing crashes21:16
sonney2k_trace it down!21:16
blackburnfixing converters21:17
sonney2k_wiking, please paste your minimal example21:22
wikingi'm tracing21:22
wikingso now i've ended up currently21:22
wikingthat the matrix's ref counter is 121:23
wikingbefore it's dtor is being called21:23
wikingand that should be the ideal case21:23
sonney2k_except if someone else still has a ptr to the object and wants to destroy it21:24
wikingwell not anymore...21:24
wikingsince the dtor is being called because the function returns21:24
sonney2k_blackburn, converter_isomap21:25
sonney2k_==27272== HEAP SUMMARY:21:25
sonney2k_==27272== ERROR SUMMARY: 301 errors from 3 contexts (suppressed: 4 from 4)21:25
blackburnsonney2k_: IKNOW21:25
sonney2k_holds the record!21:25
blackburnsonney2k_: I reuse matrices => double frees21:25
wikingso somebody somehow already frees21:25
wikingi mean freed the memory21:26
wikingalthough the ref count was >021:26
sonney2k_maybe we have a conceptional bug21:26
-!- cronor [] has joined #shogun21:27
sonney2k_wiking, calling matrix.unref()21:27
sonney2k_will set the matrix ptr and everything to NULL21:27
blackburnmy ccache cache is not filling up21:28
blackburnsonney2k_: how did you config it?21:28
sonney2k_so calling unref() again is not a problem21:28
sonney2k_blackburn, we have to move code back to .cpp (in sgvectr/matrix ...)21:28
blackburnsonney2k_: why?21:28
sonney2k_wiking, except if the refcount ptr is not set to NULL21:28
sonney2k_then unref could be done twice21:28
* sonney2k_ checks21:29
-!- cronor [] has quit [Quit: cronor]21:30
-!- cronor [] has joined #shogun21:30
-!- cronor [] has quit [Client Quit]21:30
sonney2k_wiking, seems like I didn't set m_refcount to NULL21:31
sonney2k_so an x.unref() followed by another x.unref() will decr. the counter twice21:31
sonney2k_which doesn't make sense21:31
sonney2k_x should have no content/refcount after calling x.unref()21:32
sonney2k_other y=x ' should21:32
sonney2k_I guess we should make the ref function private to indicate that it is not necessary to use it21:33
sonney2k_from the outside21:33
sonney2k_only converters fail now21:36
sonney2k_all the rest is good21:36
-!- cwidmer [] has joined #shogun21:36
blackburnsonney2k_: yes yes will fix in a min21:36
sonney2k_well almost21:36
CIA-113shogun: Sergey Lisitsyn master * r67b73a2 / src/shogun/converter/LocallyLinearEmbedding.cpp : Fixed LLE/HLLE/LTSA crashers -
wikinghow the hell did i manage this: "pure virtual method called"21:48
-!- cronor [] has joined #shogun21:48
wikingif i use the kernel in an svm21:49
wikingand then on the end i call sg_unref on that kernel21:49
wikingi get this error21:49
wikingi must say21:50
wikingcan be applied21:50
wikingit works21:50
CIA-113shogun: Soeren Sonnenburg master * r9490ebe / (4 files in 4 dirs):21:50
CIA-113shogun: unref'ing SGReferenced data twice is OK now21:50
CIA-113shogun:  -
blackburnsonney2k_: need help there21:54
blackburnaccuracy =  0.8921:54
blackburnauROC =  1.021:54
sonney2k_or is this rounding error?21:54
sonney2k_like 0.99999921:54
blackburnsonney2k_: we have a bug in ROC21:54
sonney2k_ohhh no21:54
blackburnsonney2k_: I had a test for that21:55
blackburnlet me try to check if it still coincides21:55
blackburnsonney2k_: fully coincides21:56
sonney2k_what are the labels?21:56
blackburnsonney2k_: where?21:56
sonney2k_you calc acc/roc22:00
sonney2k_so you need two label objects22:00
blackburnsonney2k_: in case of 1.0 roc?22:01
sonney2k_blackburn, btw there are very few errors left in examples/libshogun22:01
sonney2k_wiking, if you want to have a look...22:02
wikingsonney2k_: testing it22:02
wikingi mean my example22:02
wikingjust doing the valgrinding22:02
blackburn 1.20814879 -1.20519035  1.18351523  1.30408547 -0.829074    0.3439955822:03
blackburn1. -1.  1.  1. -1. -122:03
sonney2k_what is ground truth?22:03
blackburnfirst 6 of result and ground truth22:03
sonney2k_acc cannot be 122:04
sonney2k_last is wrong22:04
blackburnsonney2k_: auroc?22:04
blackburnaccuracy is not 1.022:04
wikingvalgrind is giving me a headache with the bad frees22:04
sonney2k_wiking, I dont' have any now22:05
sonney2k_blackburn, but 0.8333 right?22:05
blackburnaccuracy =  0.8922:05
blackburnauROC =  1.022:05
wikingsonney2k_: same story as i've explained before... double freeing22:06
sonney2k_blackburn, but only 1/6th is wrong so 1-1/6=0.8322:06
blackburnsonney2k_: 400 labels22:06
sonney2k_wiking, are you sure you use latest git?22:06
blackburnI printed only 622:06
blackburnsonney2k_: ROC looks like22:06
blackburny axis22:06
wikingcommit 9490ebe98295796cf3a9fc5ed69b8c5dd6a6a35422:07
blackburnand x axis22:07
sonney2k_blackburn, sgvector related bug?22:07
sonney2k_is auroc now always 1.0?22:08
blackburnsonney2k_: now22:08
blackburnaccuracy =  0.812522:08
blackburnauROC =  0.92562522:08
blackburndifferent setting22:08
sonney2k_blackburn, and valgrind gives no errors?22:08
blackburnhmm let me try mldata script22:08
blackburnwith this ouputs22:08
wikingenough dense/short?22:12
sonney2k_wiking, obviously wrong22:17
sonney2k_and btw you can use init_shogun_with_defaults()22:17
sonney2k_then you don't need print msg etc22:17
wikingthis all was copied from an example...22:18
wikingi just densed it down...22:18
sonney2k_which example?22:18
sonney2k_fixing it there22:19
-!- cwidmer [] has quit [Quit: Konversation terminated!]22:21
CIA-113shogun: Soeren Sonnenburg master * r1306d2d / examples/undocumented/libshogun/classifier_multiclasslinearmachine.cpp : fix double free in example -
wikingsonney2k_: have u tried running that code with that fix?22:24
wikingsince it's going to fail22:24
blackburnsonney2k_: defeated!22:25
sonney2k_blackburn, what was it?22:26
blackburnsonney2k_: russia 2-0 germany @ IIHF world championship :P22:26
wikingsonney2k_: i meant your last fix for classifier_multiclasslinearmachine.cpp22:26
sonney2k_wiking, it works w/o valgrind errors here22:28
wiking[ERROR] assertion labels.vector && idx<get_num_labels() failed in file features/Labels.cpp line 23122:28
wikingif i apply that last patch22:28
blackburnsonney2k_: no bug in roc22:29
sonney2k_feel free to fix22:30
* sonney2k_ ZZZzzzzz....22:30
blackburnno bug!22:30
sonney2k_blackburn, tomorrow can you help w/ sgstringlist / sparse vector?22:30
sonney2k_blackburn, do you prefer to do SparseVector/Matirx22:31
sonney2k_or string/stringlist?22:31
blackburnstring is your expertise :)22:31
sonney2k_ok then I will do strings22:31
sonney2k_you do sparse22:31
sonney2k_and then we commit and break everything again22:31
blackburnokay time to finally fix isomap/mds22:32
sonney2k_blackburn, have a look at libshogun examples ... maybe you can fix the remaining err's22:32
blackburnI'll do22:32
sonney2k_wiking, and you too22:32
wikingsonney2k_: i know what's the error22:32
wikingi'll give you a pull req22:32
wikingnum_cols = number of vectors in the matrix right?22:34
CIA-113shogun: Viktor Gal master * r520fb0c / examples/undocumented/libshogun/classifier_multiclasslinearmachine.cpp :22:37
CIA-113shogun: Fix the number of vectors in the matrix22:37
CIA-113shogun:  -
wiking"pure virtual method called"22:47
-!- cronor [] has quit [Quit: cronor]22:48
wikingblackburn: ok so here comes a new example23:05
wikingand it'll work with kernel MC as well as soon as pluskid's pull req is applied23:08
wikinghahaha heiko got the credit for this example in the licensing but yeah it's alright ;)23:09
CIA-113shogun: Viktor Gal master * re5d216a / examples/undocumented/libshogun/evaluation_cross_validation_multiclass.cpp : Add Multiclass cross-validation example to libshogun -
wikingblackburn: aroud?23:34
wikingso i was debugging ./classifier_mklmulticlass23:35
wikingand realized in the error messages23:35
wikingthat get_name is not overridden23:35
wikingjust a sec23:35
wikingi'll give u a fix23:35
blackburnI can fix it23:36
blackburnto avoid moar pull requests :D23:36
wikingheheh nooo23:36
wikingi want my commits!23:36
blackburnheheh sure23:36
wikingbut the weirder error is that the kernel doesn't have lhs23:39
wikingafter train :(23:39
wiking[ERROR] MKLMulticlass: No left hand side specified :(23:41
blackburnwiking: yeap23:41
CIA-113shogun: Viktor Gal master * r82cc060 / src/shogun/classifier/mkl/MKLMulticlass.h : Override get_name in MKLMulticlass -
wikingstill debugging23:42
wikingblackburn: just for you: "You've got to love Vladimir Putin. Otherwise you die in an unexplained accident."23:47
blackburnhopefully no23:47
blackburnopposition guys play some crazy games with police right now23:48
wikingwtf can happen with this mkl's kernel23:53
--- Log closed Wed May 09 00:00:37 2012