--- Log opened Mon Nov 28 00:00:59 2011
02:24 -!- blackburn [~blackburn@] has quit [Quit: Leaving.]
06:35 -!- in3xes [~in3xes@] has joined #shogun
08:34 -!- blackburn [~blackburn@] has joined #shogun
08:44 <sonne|work> moin blackburn
08:44 <sonne|work> I finally understood the typemap issue
08:44 <sonne|work> typecheck typemaps are only called when there is ambiguity
08:44 <blackburn> and what is called if not?
08:45 <sonne|work> e.g. if there are 2 functions, e.g. set_vector(float64_t*, int32_t) / set_vector(SGVector<> v)
08:45 <sonne|work> if not, then the function is *always* called
08:45 <sonne|work> so it needs to check the type too
08:45 <sonne|work> that's all
08:45 <sonne|work> I will rework the typemaps, hopefully tonight
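The overload situation described above can be sketched as a SWIG typecheck typemap. This is a hypothetical fragment, not shogun's actual typemap code; `is_array`, `array_numdims` and `array_type` are helpers from SWIG's numpy.i interface file:

```swig
/* A typecheck typemap is only consulted when SWIG must disambiguate
 * overloads such as set_vector(float64_t*, int32_t) vs.
 * set_vector(SGVector<float64_t>). If a method has a single signature,
 * the input typemap fires unconditionally, so the type check has to
 * happen there as well. */
%typemap(typecheck, precedence=SWIG_TYPECHECK_DOUBLE_ARRAY)
        (float64_t* vec, int32_t len)
{
    /* accept only one-dimensional NumPy arrays of float64 */
    $1 = (is_array($input) && array_numdims($input) == 1 &&
          array_type($input) == NPY_DOUBLE) ? 1 : 0;
}
```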
08:46 <blackburn> anyway we should remove the float64_t*, int32_t stuff
08:46 <blackburn> sonne|work: I ran tests yesterday, did HMM fail before?
08:47 <sonne|work> blackburn: the problem only occurs when the float64_t* stuff is *not* there
08:47 <blackburn> sonne|work: heh, I see
08:47 <sonne|work> blackburn: re HMM I don't remember
08:47 <sonne|work> you need to test it
08:47 <sonne|work> btw, regarding your valgrind question
08:48 <sonne|work> you can use valgrind suppressions
08:48 <blackburn> sonne|work: my question was not about suppression but about the necessity of testing with valgrind
08:48 <sonne|work> there used to be a suppression file somewhere in /usr/share/doc/python*
08:49 <sonne|work> then you will only see shogun errors, not the python ones
08:49 <sonne|work> (which are in fact false alarms)
08:49 <blackburn> i.e. if grepping the output for shogun finds anything, then something is wrong
08:50 <blackburn> sonne|work: IIRC I would need to recompile my python to use it, and that is not the problem I described
08:50 <sonne|work> no, you don't need to recompile python for that
08:50 <sonne|work> it is just a list of errors to ignore
08:50 <sonne|work> ...that you pass to valgrind
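A minimal sketch of the suppression workflow; the suppression entry, file paths and the example script name are illustrative assumptions, not shogun's actual setup:

```shell
# Write a tiny suppression file of the kind shipped in
# /usr/share/doc/python* (entry name and rule are illustrative).
cat > /tmp/python.supp <<'EOF'
{
   python_allocator_false_positive
   Memcheck:Cond
   fun:PyObject_Free
}
EOF

# Pass it to valgrind so only shogun's own errors are reported.
# Guarded so the sketch is a no-op where valgrind is not installed;
# the example script name is hypothetical.
command -v valgrind >/dev/null && \
    valgrind --suppressions=/tmp/python.supp --leak-check=full \
        python examples/classifier_libsvm_modular.py || true
```

No recompilation of Python is needed: the file is purely a list of error patterns for valgrind to ignore at report time.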
08:51 <blackburn> sonne|work: I would like you to test HMM because I really don't know how to use it
08:52 <blackburn> the results are different from the ones we have now
08:52 <sonne|work> did the tests fail in 1.0.0?
08:52 <blackburn> sonne|work: ah, you suggest testing it with 1.0.0?
08:53 <blackburn> hmm, why not, ok
08:53 <sonne|work> I mean then we know whether we introduced this problem in 1.1...
08:54 <blackburn> sonne|work: I started playing with SVMs and noticed that classifier/svm is like a dark forest
08:55 <blackburn> so much 'unstructured' code
08:55 <sonne|work> what do you mean?
08:56 <blackburn> well there are a lot of files with strange names
08:56 <sonne|work> you mean like SVM, LDA, PCA? :D
08:56 <blackburn> no code style in the sources
08:56 <blackburn> i.e. some of them are uppercase
08:56 <blackburn> some are lower
08:57 <sonne|work> please be more specific
08:57 <sonne|work> or give me an example
08:57 <blackburn> sonne|work: the code is impossible to maintain
08:57 <blackburn> at least for me
08:57 <sonne|work> which code?
08:57 <blackburn> what if there is an error?
08:57 <blackburn> any svm
08:59 <blackburn> sonne|work: I have no idea how to maintain it properly
08:59 <sonne|work> you mean the deep internals of the svm algos?
09:00 <sonne|work> yes, that is impossible for non-experts
09:01 <sonne|work> that is why everyone uses e.g. libsvm's code as a black box
09:01 <sonne|work> well, ok, I did a couple of changes to libsvm, but I tested them
09:01 <sonne|work> and you can only do that when you have read their paper...
09:07 <blackburn> sonne|work: hmm, I think it is the same for dimreduction and you
09:07 <blackburn> my code is incomprehensible as well
09:09 <sonne|work> exactly :)
09:10 <sonne|work> it takes a *long* time to understand it...
09:13 <blackburn> sonne|work: especially in the case of LLE
09:13 <blackburn> I did it with alignment and it is not as in the paper
09:14 <blackburn> that's why it is ∞ times faster than any other impl
09:15 <blackburn> I wonder, sonne|work, have you heard anything about the elections in russia this week?
09:16 <blackburn> I think nobody even knows what is going on because nobody cares :)
09:16 <blackburn> but still interested
09:19 <sonne|work> what happened?
09:22 -!- in3xes [~in3xes@] has quit [Quit: Leaving]
09:29 <blackburn> sonne|work: well, we all know who will be elected :D
09:30 <blackburn> the government party cheats way too much..
09:34 <sonne|work> like everywhere...
09:34 <blackburn> sonne|work: you can't imagine how much
09:34 <blackburn> sonne|work: we already know who will be president for the next 4 years
10:21 <blackburn> sonne|work: yes, there is a regression with HMM between 1.0.0 and 1.1.0
10:32 -!- blackburn [~blackburn@] has quit [Quit: Leaving.]
12:40 -!- blackburn [5bdfb203@gateway/web/freenode/ip.] has joined #shogun
12:46 <blackburn> hah, my new lcd screen for the notebook has finally arrived
12:46 <blackburn> 67 days on the long way home :D
12:47 <blackburn> crazy, it took 67 days to get from GB to Russia
12:54 <sonne|work> blackburn: congrats
12:55 <blackburn> sonne|work: have 3 mins?
12:55 <sonne|work> blackburn: regarding the HMM regression - could you do a git bisect to figure out the commit that is causing the trouble?
12:55 <blackburn> sonne|work: sure, tonight
12:55 <blackburn> sonne|work: which multiclass SVM would you suggest using?
12:55 <sonne|work> ask, but I may have to leave
12:56 <blackburn> I used larank
12:56 <sonne|work> (true multiclass)
12:56 <blackburn> is it worse?
12:56 <sonne|work> one is online, one is not
12:56 <blackburn> GMNPSVM looks like it takes forever
12:56 <sonne|work> larank can be faster on more data
12:56 <blackburn> will it produce better accuracy?
12:56 <sonne|work> how many classes / data points?
12:56 <sonne|work> I would expect so
12:57 <blackburn> 43 classes, 200-600 each
12:57 <blackburn> LibSVMMultiClass was slightly worse, 84% accuracy
12:57 <blackburn> and with LaRank I got ~87%
12:57 <sonne|work> precompute the kernel matrix?
12:57 <sonne|work> ok, but then you are already pretty good
12:57 <blackburn> sonne|work: gaussian on HOG :)
12:58 <blackburn> can I expect any other impact on accuracy with GMNPSVM?
12:59 <blackburn> it is pretty good indeed, but others got almost 99% with convolutional neural networks
12:59 <blackburn> I would like to beat them hah
13:01 <blackburn> I guess you had to leave :)
13:09 <sonne|work> well, convolutional NNs are very hard to beat
13:09 <sonne|work> for such object recognition tasks
13:10 <blackburn> sonne|work: aren't they overfitted?
13:10 <blackburn> are they really that good?
13:12 <sonne|work> they can be, when done right
13:12 <sonne|work> alright, back to work
13:12 <blackburn> hmm, I'm in doubt
13:28 <blackburn> I guess I have to try CNNs..
13:35 <sonne|work> I would be very interested in the results
13:35 <sonne|work> btw, did you use larank from shogun?
13:35 <sonne|work> did you normalize your data?
13:35 <sonne|work> did you add virtual examples (by rotating/shifting objects)?
13:35 <sonne|work> scaling etc
13:35 <sonne|work> all these things help
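The virtual-example trick is easy to sketch with plain NumPy: augment the training set with shifted copies of each image. The array shapes and the set of shifts below are assumptions for illustration, not what was actually used:

```python
import numpy as np

def virtual_examples(images, labels, shifts=((0, 1), (1, 0), (0, -1), (-1, 0))):
    """Return the original images plus one copy per (dy, dx) shift.

    images: array of shape (n, height, width); labels: array of shape (n,).
    np.roll wraps pixels around the border, which is fine for a sketch.
    """
    out_imgs, out_labels = [images], [labels]
    for dy, dx in shifts:
        shifted = np.roll(np.roll(images, dy, axis=1), dx, axis=2)
        out_imgs.append(shifted)
        out_labels.append(labels)
    return np.concatenate(out_imgs), np.concatenate(out_labels)

X = np.random.rand(10, 8, 8)   # 10 toy 8x8 "images"
y = np.arange(10) % 2
Xa, ya = virtual_examples(X, y)
print(Xa.shape)                # (50, 8, 8): originals plus 4 shifted copies each
```

The same idea extends to rotations and rescalings; the point is that the classifier sees invariances it could not infer from the raw data alone.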
13:46 <blackburn> sonne|work: me too; from shogun, yes; normalized, but did not extract RoI; and no virtual examples
14:10 <blackburn> sonne|work: I don't like neural nets at all, what about you? :)
14:11 <sonne|work> me neither
14:11 <sonne|work> too many local minima
14:11 <sonne|work> very hard to control
14:12 <blackburn> I don't have much SVM experience
14:12 <blackburn> but it seems to be better
14:19 <sonne|work> well, it is a convex optimization problem, so local minima == global minima
14:20 <sonne|work> it helps a lot when you know that the same choice of model parameters leads to the same result
14:21 <blackburn> as I understand it
14:21 <blackburn> an SVM will have similar results on slightly different data
14:21 <blackburn> but an NN could go totally wrong
14:21 <blackburn> like ill-posed problems in linalg
14:28 <sonne|work> like trying to find the roots of a polynomial with *many* variables
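The convexity point can be illustrated numerically: on a convex objective, gradient descent reaches the same minimum from any starting point, while on a non-convex one it lands in different local minima depending on initialization. The toy objectives, step size and iteration count are arbitrary choices for the sketch:

```python
def descend(grad, x, lr=0.01, steps=5000):
    """Plain gradient descent on a 1-d objective, given its gradient."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# convex: f(x) = (x - 3)^2, unique minimum at x = 3
convex_grad = lambda x: 2 * (x - 3)
# non-convex: f(x) = x^4 - 4x^2, local minima at x = +/- sqrt(2)
nonconvex_grad = lambda x: 4 * x**3 - 8 * x

print(descend(convex_grad, -10.0), descend(convex_grad, 10.0))      # both -> 3.0
print(descend(nonconvex_grad, -1.0), descend(nonconvex_grad, 1.0))  # different minima
```

For an SVM the training objective is convex in this sense, so retraining with the same parameters reproduces the same solution; a neural net's loss surface is the non-convex case.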
16:24 <blackburn> time to go to gym :)
16:24 -!- blackburn [5bdfb203@gateway/web/freenode/ip.] has quit [Quit: Page closed]
19:07 -!- blackburn [~blackburn@] has joined #shogun
19:52 -!- mode/#shogun [+o sonney2k] by ChanServ
21:32 <blackburn> 7 steps..
22:20 <blackburn> 3 left
22:54 <blackburn> I failed to determine the reason
--- Log closed Tue Nov 29 00:00:59 2011