--- Log opened Mon Nov 28 00:00:59 2011
02:24 -!- blackburn [~blackburn@83.234.54.62] has quit [Quit: Leaving.]
06:35 -!- in3xes [~in3xes@180.149.49.230] has joined #shogun
08:34 -!- blackburn [~blackburn@188.168.4.251] has joined #shogun
08:44 morning blackburn
08:44 hi
08:44 I finally understood the typemap issue
08:44 typecheck typemaps are only called when there is ambiguity
08:44 oh
08:44 and what is called if not?
08:45 e.g. if there are 2 functions named e.g. set_vector(float64_t*, int32_t) / set_vector(SGVector<> v)
08:45 if not, then the function is *always* called
08:45 hmm
08:45 so it needs to check the type too
08:45 that's all
08:45 I will rework the typemaps, hopefully tonight
08:46 anyway, we should remove the float64_t*, int32_t stuff
08:46 sonne|work: I ran tests yesterday, did HMM fail before?
08:47 blackburn: the problem only occurs when the float64_t* stuff is *not* there
08:47 sonne|work: heh, I see
08:47 blackburn: re HMM, I don't remember
08:47 you need to test it
08:47 btw, regarding your valgrind question
08:48 you can use valgrind suppressions
08:48 sonne|work: that is not about suppression but the necessity of testing with valgrind
08:48 there used to be a suppression file somewhere in /usr/share/doc/python*
08:49 then you will only see shogun errors, not the python ones
08:49 (which are in fact false alarms)
08:49 i.e. if grep finds shogun then something went wrong
08:50 sonne|work: IIRC I would need to recompile my python to use it; it is not the problem I describe
08:50 no, you don't need to recompile python for that
08:50 it is just a list of errors to ignore
08:50 ...that you pass to valgrind
08:51 sonne|work: I would like you to test HMM because I really don't know how to use it
08:52 the results are different from the ones we have now
08:52 did the tests fail in 1.0.0?
08:52 sonne|work: ah, do you suggest testing it with 1.0.0?
08:53 hmm, why not, ok
08:53 I mean, then we know whether we introduced this problem in 1.1...
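[Editor's note: the typecheck behaviour discussed above can be sketched as a SWIG interface fragment. This is a hypothetical illustration, not shogun's actual interface file; `is_numeric_array` is an assumed helper function.]

```
/* Sketch: with two overloads such as
 *   set_vector(float64_t*, int32_t)  and  set_vector(SGVector<float64_t>)
 * SWIG consults typecheck typemaps to pick one. With a single overload the
 * check is skipped and the function is called unconditionally, so the "in"
 * typemap must validate the argument type itself. */
%typemap(typecheck, precedence=SWIG_TYPECHECK_DOUBLE_ARRAY) SGVector<float64_t>
{
    /* is_numeric_array is a hypothetical helper, not a real shogun symbol */
    $1 = is_numeric_array($input, NPY_FLOAT64) ? 1 : 0;
}
```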
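[Editor's note: the suppression file mentioned above is a plain-text list of error patterns passed to valgrind via `--suppressions=`; CPython ships one as `Misc/valgrind-python.supp`. An entry has roughly this shape (frame name taken from CPython's file; treat the exact fields as an example, not a spec):]

```
{
   ADDRESS_IN_RANGE/Invalid read of size 4
   Memcheck:Addr4
   fun:Py_ADDRESS_IN_RANGE
}
```

```
valgrind --suppressions=valgrind-python.supp python example.py
```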
08:54 sonne|work: I started playing with SVMs and noticed that classifier/svm is like a dark forest
08:55 so much 'unstructured' code
08:55 what do you mean?
08:56 well, there are a lot of files with strange names
08:56 you mean like SVM, LDA, PCA? :D
08:56 no code style in the sources
08:56 i.e. some of them are uppercase
08:56 some are lowercase
08:56 etc.
08:57 please be more specific
08:57 or give me an example
08:57 sonne|work: the code is impossible to maintain
08:57 at least for me
08:57 which code?
08:57 what if there is an error?
08:57 any svm
08:59 sonne|work: I have no idea how to maintain it properly
08:59 you mean the deep internals of the svm algos?
08:59 yes
09:00 yes, that is impossible for non-experts
09:01 that is why everyone uses e.g. libsvm's code as a black box
09:01 well, ok, I did a couple of changes to libsvm, but tested them
09:01 and you can only do it when you have read their paper...
09:07 sonne|work: hmm, I think it is the same for dimreduction and you
09:07 my code is incomprehensible as well
09:07 :D
09:09 exactly :)
09:10 it takes a *long* time to understand it...
09:13 sonne|work: especially in the case of LLE
09:13 I did it with alignment and it is not as in the paper
09:14 that's why it is $\infty$ times faster than any other impl
09:15 I wonder, sonne|work, did you ever hear anything about the elections in Russia this week?
09:16 I think nobody even knows what is going on because nobody cares :)
09:16 but still interested
09:19 no
09:19 what happened?
09:22 -!- in3xes [~in3xes@180.149.49.230] has quit [Quit: Leaving]
09:29 sonne|work: well, we all know who will be elected :D
09:30 the government party cheats way too much...
09:34 like everywhere...
09:34 sonne|work: you can't imagine how much
09:34 sonne|work: we know who will be president for the next 4 years already
10:21 sonne|work: yes, there is a regression with HMM between 1.0.0 and 1.1.0
10:32 -!- blackburn [~blackburn@188.168.4.251] has quit [Quit: Leaving.]
12:40 -!- blackburn [5bdfb203@gateway/web/freenode/ip.91.223.178.3] has joined #shogun
12:46 hah, my new LCD screen for the notebook has finally arrived
12:46 67 days on the long way home :D
12:47 crazy, it took 67 days to get from GB to Russia
12:54 blackburn: congrats
12:55 sonne|work: have 3 mins?
12:55 blackburn: regarding the HMM regression - could you do a git bisect to figure out the commit that is causing the trouble?
12:55 sonne|work: sure, tonight
12:55 sonne|work: which multiclass SVM would you suggest using?
12:55 ask, but I may have to leave
12:55 GMNPSVM
12:56 I used LaRank
12:56 (true multiclass)
12:56 is it worse?
12:56 one is online, one is not
12:56 GMNPSVM looks like it takes infinite time
12:56 LaRank can be faster on more data
12:56 sure
12:56 will it produce better accuracy?
12:56 how many classes / data points?
12:56 I would expect so
12:57 43 classes, 200-600 each
12:57 LibSVMMultiClass was slightly worse, 84% accuracy
12:57 and with LaRank I got ~87%
12:57 precompute the kernel matrix?
12:57 ok, but then you are already pretty good
12:57 sonne|work: Gaussian on HOG :)
12:58 can I expect any other impact on accuracy with GMNPSVM?
12:59 it is pretty good indeed, but others got almost 99% with convolutional neural networks
12:59 I would like to beat them, hah
13:01 I guess you had to leave :)
13:09 well, convolutional NNs are very hard to beat
13:09 for such object recognition tasks
13:10 sonne|work: aren't they overfitted?
13:10 are they really so good?
13:11 http://yann.lecun.com/exdb/mnist/
13:12 they can be when done right
13:12 alright, back to work
13:12 ok
13:12 hmm, I have doubts
13:28 I guess I have to try CNNs...
13:35 I would be very interested in the results
13:35 btw, did you use LaRank from shogun?
13:35 did you normalize your data?
13:35 did you add virtual examples (by rotating/shifting objects)?
13:35 scaling, etc.
13:35 all these things help
13:46 sonne|work: me too; from shogun, normalized, but did not extract RoI, no
14:10 sonne|work: I don't like any neural nets, what about you? :)
14:11 me neither
14:11 too many local minima
14:11 very hard to control
14:12 I don't have much SVM experience
14:12 but it seems to be better
14:19 well, it is a convex optimization problem, so local minima == global minima
14:20 it helps a lot when you know that the same choice of model parameters leads to the same result
14:21 as I understand it
14:21 an SVM will give similar results on slightly different data
14:21 but an NN could go totally wrong
14:21 right?
14:21 like ill-posed things in linalg
14:28 like trying to find the roots of a polynomial with *many* variables
16:24 time to go to the gym :)
16:24 -!- blackburn [5bdfb203@gateway/web/freenode/ip.91.223.178.3] has quit [Quit: Page closed]
19:07 -!- blackburn [~blackburn@188.168.5.8] has joined #shogun
19:52 -!- mode/#shogun [+o sonney2k] by ChanServ
21:32 7 steps...
22:20 3 left
22:54 damn
22:54 I failed to determine the reason
--- Log closed Tue Nov 29 00:00:59 2011
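[Editor's note: the "convex means local minimum == global minimum" point above can be illustrated with a toy gradient descent. This is an illustrative sketch, not shogun code; the example functions are arbitrary choices.]

```python
# For a convex objective, gradient descent reaches the same minimum from any
# starting point; for a non-convex one, the start decides which local minimum
# it lands in (the "too many local minima" complaint about neural nets).

def descend(grad, x, lr=0.01, steps=5000):
    """Plain gradient descent on a 1-D function given its gradient."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# convex: f(x) = (x - 3)^2, gradient 2(x - 3); unique minimum at x = 3
convex = [descend(lambda x: 2 * (x - 3), x0) for x0 in (-10.0, 10.0)]

# non-convex: f(x) = x^4 - 2x^2, gradient 4x^3 - 4x; local minima at x = +/-1
nonconvex = [descend(lambda x: 4 * x**3 - 4 * x, x0) for x0 in (-2.0, 2.0)]

print(convex)     # both ~3.0, regardless of start
print(nonconvex)  # ~-1.0 and ~1.0: the start decides the answer
```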