--- Log opened Mon Jul 25 00:00:12 2011
--- Day changed Mon Jul 25 2011
blackburnsonney2k: libsvm doesn't work in either python or java00:00
blackburnSystemError: [ERROR] assertion kernel->get_num_vec_lhs()==problem.l failed in file classifier/svm/LibSVM.cpp line 8500:00
@sonney2kI don't understand00:01
@sonney2kyou said the minimal example worked?00:01
@sonney2kthat should be using libsvm too00:01
blackburnminimal example yes, - not00:02
@sonney2kbut where is the difference?00:02
@sonney2kthey both do the same00:02
blackburnI would fix that already if I knew00:03
@sonney2kblackburn, I suspect that the number of labels doesn't match the matrix00:03
@sonney2kdoes python work?00:03
@sonney2kor java?00:03
@sonney2kor none?00:03
blackburnah yes00:04
blackburnmy fault00:04
@sonney2kbtw the minimal example in java cannot work00:04
@sonney2kforget what I just said00:04
@sonney2kblackburn, your fault?00:05
@sonney2kwhat does 'ah yes' mean?00:05
blackburnnumber of labels wrong00:05
blackburnsonney2k: why minimal example in java cannot work?00:06
@sonney2kdid you read what I said above?00:06
blackburnyes I did00:06
@sonney2k<sonney2k> btw the minimal example in java cannot work00:07
@sonney2k<sonney2k> forget what I just said00:07
blackburnsonney2k: libsvm produces the same results00:08
@sonney2kblackburn, as in same labels.get_labels() ?00:08
blackburnblackburn@blackburn-laptop:~/shogun/shogun/examples/undocumented/java_modular$ ./ classifier_libsvm_modular.java00:08
blackburn[0.1938791717197525, 0.19659259940936621]00:08
blackburnblackburn@blackburn-laptop:~/shogun/shogun/examples/undocumented/python_modular$ python classifier_libsvm_modular.py00:08
blackburn[ 0.19387917  0.1965926 ]00:08
@sonney2k2 outputs only?00:10
@sonney2kwe should have 9200:10
blackburnyes, I modified data00:10
blackburnI can't check 92 numbers for equality00:10
@sonney2kbetter check for the whole sample00:10
blackburntakes more time00:10
@sonney2kjust print them and do a diff00:10
@sonney2kyes I understand but it is impossible to have 92 numbers to match by chance00:11
blackburnokay okay00:11
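[Editor's note: sonney2k's suggestion — print all 92 outputs from each interface and diff them — can be sketched in plain Python. The filenames and tolerance below are hypothetical, for illustration only.]

```python
import numpy as np

def outputs_match(file_a, file_b, tol=1e-9):
    """Compare two whitespace-separated dumps of classifier outputs elementwise."""
    a = np.loadtxt(file_a)
    b = np.loadtxt(file_b)
    # rtol=0 so `tol` is a strict absolute tolerance on each element
    return a.shape == b.shape and bool(np.allclose(a, b, rtol=0.0, atol=tol))

# Usage sketch: dump the outputs from each interface first, e.g.
#   python classifier_libsvm_modular.py > py_out.txt
#   (and the analogous Java run redirected to java_out.txt)
# then outputs_match('py_out.txt', 'java_out.txt') tells you if all
# 92 numbers agree, which by chance they would not.
```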
@sonney2kohh dam'd I am compiling shogun on debian unstable00:11
@sonney2klots of new warnings....00:11
@sonney2kI think I should start preparing a debian package00:12
@sonney2kfor the new thing00:12
blackburnsonney2k: the same for 9200:13
blackburnok, i'm pretty tired with java today00:16
@sonney2kblackburn, what did you expect?00:16
@sonney2kI think it ran rather smoothly00:16
@sonney2kit basically worked out of the box00:16
blackburncan't understand what you are talking about00:17
@sonney2kdidn't you only have to modify /
@sonney2kno bugs in typemaps (so far)00:17
@sonney2kso yes - that is trivial compared to typemap bugs00:17
blackburnit is00:18
@sonney2kso we are lucky00:18
@sonney2kand chances are that other examples will just work00:18
blackburnI saw some strange outputs in some of them00:19
blackburnNaN or so00:19
blackburntomorrow will take a look00:19
@sonney2kI mean now it remains to test if string based typemaps work00:19
@sonney2ke.g. stringfeatures00:19
@sonney2kand then some really complex example with preprocessors attached or multiple kernels00:20
@sonney2kif something big works then the rest is just minor issues00:20
@sonney2kblackburn, I really think we need someone doing a tutorial with some nice data set00:24
@sonney2kI mean like we have a certain data set and no idea about it00:24
@sonney2kso we do pca or so first00:24
@sonney2kand visualize it00:24
@sonney2kthen we do some classification or so00:24
blackburnwith as many methods used as possible?00:25
@sonney2klike some story line from very explorative unsupervised00:28
@sonney2kto simple supervised00:28
@sonney2ke.g. linear00:28
@sonney2kthen e.g. svm w/ kernels00:28
@sonney2kand then maybe even multiple kernels / data sources00:29
blackburngood idea00:29
@sonney2kthat could work for anything, 2-class classification, regression, multiclass00:30
@sonney2kwould be cool to use heikos x-validation on top already for that00:30
blackburnI would say I can do it if I wasn't embarrassed with my manifold learning algos00:31
@sonney2kblackburn, don't worry at some point we will have shogun 1.0 and then we might have time to work on some nice applications too :)00:42
@sonney2kgoing to bed now00:42
blackburnsee you00:43
-!- blackburn [~blackburn@] has quit [Ping timeout: 255 seconds]00:49
-!- f-x [~user@] has joined #shogun01:59
-!- f-x_ [fx@] has joined #shogun04:02
-!- f-x [~user@] has quit [Ping timeout: 260 seconds]04:24
-!- in3xes [~in3xes@] has quit [Quit: Leaving]05:09
-!- gsomix [~gsomix@] has joined #shogun05:45
-!- [1]warpy [] has quit [Quit: HydraIRC -> <- Like it? Visit #hydrairc on EFNet]07:08
-!- gsomix [~gsomix@] has quit [Read error: Connection reset by peer]07:29
-!- f-x [~user@] has joined #shogun08:52
-!- sploving1 [~sploving@] has joined #shogun08:55
@sonney2ksploving1, could you please test on the kernel example first?09:11
sploving1I tested all of them09:12
sploving1the results are not the same!09:12
@sonney2kwhy what?09:13
@sonney2kwhen you just set features and get the features09:13
sploving1the results are not the same (python, lua)09:13
@sonney2kare they the same as in python?09:13
@sonney2ksploving1, yes you said that already - now we need to debug why not.09:14
sploving1sonney2k, why are two runs not the same?09:14
@sonney2kthey were not the same in (python,java) either. but now they are at least for some examples09:15
@sonney2ksploving1, please first try a simpler example like just setting features / getting features09:15
sploving1you can give me the name of the example and I will try; I do not know what setting/getting features means09:17
@sonney2ksploving1, I see09:17
@sonney2klets try features_simple_real_modular.py09:17
sploving1works well~09:20
sploving1sonney2k, I mean in python it works well09:20
@sonney2ksploving1, ok - now compare if that works in lua too09:21
sploving1sonney2k, can we compile them both09:21
sploving1I mean in configure, we need compile them both.09:21
@sonney2ksploving1, yes, just configure python_modular,lua_modular09:22
sploving1./configure --interfaces=python_modular,lua_modular09:22
@sonney2kbut when you already installed you don't need to09:22
@sonney2kbetween yes09:22
sploving1okay that is good09:22
@sonney2ksploving1, I am looking into the averaged perceptron issue09:22
@sonney2kit is a different problem - it seems09:23
sploving1yeap. I do not need09:28
sploving11 4 009:29
sploving10 0 909:29
sploving10 0 009:29
sploving10 5 009:29
sploving10 0 609:29
sploving19 9 909:29
sploving1this is lua result09:29
sploving1[[ 1.  2.  3.]09:29
sploving1 [ 4.  0.  0.]09:29
sploving1 [ 0.  0.  0.]09:29
sploving1 [ 0.  5.  0.]09:29
sploving1 [ 0.  0.  6.]09:29
sploving1 [ 9.  9.  9.]]09:29
-!- sploving1 was kicked from #shogun by bettyboo [flood]09:29
-!- sploving1 [~sploving@] has joined #shogun09:29
sploving1[[ 1.  2.  3.]09:29
sploving1 [ 4.  0.  0.]09:29
sploving1 [ 0.  0.  0.]09:29
sploving1 [ 0.  5.  0.]09:29
sploving1 [ 0.  0.  6.]09:29
sploving1 [ 9.  9.  9.]]09:29
-!- sploving1 was kicked from #shogun by bettyboo [flood]09:29
-!- sploving1 [~sploving@] has joined #shogun09:29
sploving1[[ 1.  2.  3.]09:30
sploving1 [ 4.  0.  0.]09:30
sploving1 09:30
sploving1 [ 0.  0.  0.]09:30
sploving1 [ 0.  5.  0.]09:30
sploving1 [ 0.  0.  6.]09:30
sploving1 [ 9.  9.  9.]]09:30
sploving1this is python result09:30
sploving1sonney2k, which is correct?09:33
@sonney2ksploving1, what is the original input?09:34
sploving1a=RealFeatures(A), a.set_feature_vector(array([1,4,0,0,0,9], dtype=float64), 0) will affect a.get_feature_matrix()??09:34
sploving1matrix=array([[1,2,3],[4,0,0],[0,0,0],[0,5,0],[0,0,6],[9,9,9]], dtype=float64)09:34
sploving1this  is the original input09:34
sploving1sonney2k, in lua it is : matrix = {{1,2,3},{4,0,0},{0,0,0},{0,5,0},{0,0,6},{9,9,9}}09:35
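[Editor's note: a numpy sketch (not shogun API) of why the python output is unchanged here. In shogun's convention each feature vector is one column of the matrix, and the vector being set is identical to column 0 already.]

```python
import numpy as np

# Each column is one example (shogun's convention), so this 6x3 matrix
# holds 3 vectors with 6 features each -- matching get_num_vectors() = 3
# and get_num_features() = 6 seen later in the log.
matrix = np.array([[1, 2, 3], [4, 0, 0], [0, 0, 0],
                   [0, 5, 0], [0, 0, 6], [9, 9, 9]], dtype=np.float64)

vec = np.array([1, 4, 0, 0, 0, 9], dtype=np.float64)

# "Setting feature vector 0" replaces column 0:
updated = matrix.copy()
updated[:, 0] = vec

# Column 0 was already (1, 4, 0, 0, 0, 9), so the matrix does not change --
# which is why the python result equals the original input.
assert np.array_equal(updated, matrix)
```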
-!- f-x [~user@] has quit [Ping timeout: 260 seconds]09:36
sploving1as I do not know features' meaning, I have no idea which result is correct09:36
@sonney2ksploving1, did you do the set_feature_vector in lua too?09:37
sploving1sonney2k, yeap . a:set_feature_vector({1,4,0,0,0,9}, 0)09:38
@sonney2ksploving1, I am a bit lost - please comment the set_feature_vector in both languages09:38
@sonney2kand then just show what you get in python (first) and then lua09:39
@sonney2ksploving1, maybe use for pasting...09:39
@sonney2kor /query me09:39
sploving1sonney2k, take a look at it!09:41
@sonney2kthe python one is correct09:42
sploving1you mean set_feature_vector has no effect on the result? sonney2k??09:43
sploving1sonney2k, then why does set_feature_vector affect the result in lua?? so strange!09:44
@sonney2ksploving1, it should not yes09:50
@sonney2ksploving1, please don't do set_feature_vector for now and check09:51
@sonney2kit is probably wrong in lua nevertheless09:51
@sonney2k(just result transposed - I guess different order in typemap)09:51
sploving1sonne2k, without it(set_..), it is the correct result09:53
sploving1it is the origin input09:54
sploving1the same with09:54
sploving1different order? sonney2k, can you explain it in more detail?09:55
sploving1so I can fix it09:55
@sonney2ksploving1, yes that can happen when both set_feature_matrix and get_feature_matrix use a different ordering09:55
@sonney2ksploving1, when you uncomment09:56
@sonney2k    print a.get_num_vectors()09:56
@sonney2k    print a.get_num_features()09:56
@sonney2kwhat do these display in lua?09:56
@sonney2ksploving1, in the end i suspect that just this array[i * cols + j] statement in lua is wrong09:58
sploving13 609:58
@sonney2kthat is correct09:58
@sonney2ksploving1, if you write  matrix = {{1,2,3},{4,0,0},{0,0,0},{0,5,0},{0,0,6},{9,9,9}} how many tables are these?10:03
@sonney2k6 right?10:04
@sonney2kso that should match rows10:04
@sonney2kand cols is 3 since each table has 3 elements10:04
sploving16 yeap10:05
-!- f-x [~user@] has joined #shogun10:05
sploving16 is rows, 3 is cols10:05
@sonney2ksploving1, does that match the meaning you have in swig_typemaps.i?10:05
sploving1yeap. now I understand set_feature_vector mean10:05
sploving1it is to set the first column10:05
sploving1I will take a look at the file and fix it10:06
@sonney2ksploving1, but please note that the vector set function is correct10:07
@sonney2kit must be the in /out typemap for SGMatrix that are *both* wrong10:07
sploving1you mean SGVector is correct and SGMatrix is wrong?10:07
sploving1sonney2k, okay10:07
@sonney2kI think it should be array[j * rows + i] in line 156 in swig_typemaps.i10:12
@sonney2kand same indexing in line 17610:13
@sonney2ksploving1, ^10:13
@sonney2kthen it should work10:13
sploving1I fixed it and now recompiling10:13
sploving1I thought shogun stored rows first, but that is wrong10:14
sploving1shogun stores columns first, then rows, sonney2k10:14
@sonney2kyes it is always column by column10:14
@sonney2klike fortran, matlab, r, octave, ...10:14
-!- blackburn [~blackburn@] has joined #shogun10:15
@sonney2k... but not python :)10:15
sploving1oh. I know10:15
@sonney2k(in python numpy one can specify that one wants fortran order - so it works there too :)10:16
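[Editor's note: the fix sonney2k describes — array[j * rows + i] instead of array[i * cols + j] — is just column-major addressing. A numpy check of that identity (illustrative, not the actual typemap code):]

```python
import numpy as np

rows, cols = 6, 3
m = np.arange(rows * cols, dtype=np.float64).reshape(rows, cols)

# Column-major (Fortran) flattening, as shogun / fortran / matlab / r store it:
flat = m.ravel(order='F')

# Element (i, j) of the matrix lives at flat[j * rows + i]:
for i in range(rows):
    for j in range(cols):
        assert flat[j * rows + i] == m[i, j]

# numpy itself defaults to row-major, where the flat index is i * cols + j --
# the ordering the buggy lua typemap assumed:
row_major = m.ravel(order='C')
for i in range(rows):
    for j in range(cols):
        assert row_major[i * cols + j] == m[i, j]
```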
sploving1sonney2k, now I want to support rubby narray10:17
@sonney2ksploving1, does it work now?10:17
@sonney2kI mean lua matrix?10:18
sploving1just compiling10:18
sploving1I fetch upstream10:18
sploving1so I git clean and compiling from new fresh10:18
sploving1I just see narray examples,but no api10:19
sploving1like numpy10:19
@sonney2kserialhex, in case you are around again ping us10:19
@sonney2ksploving1, I would do the same kind of low-level support that you did for lua10:20
sploving1sonne2k, ?10:20
sploving1what do you mean?10:20
@sonney2ksploving1, just using arrays10:22
-!- sploving1 [~sploving@] has quit [Remote host closed the connection]10:22
@sonney2khmmhh it looks like narray is still being developed10:24
@sonney2kso it is probably worth supporting10:24
@sonney2kand the api is in narray.h10:27
-!- sploving1 [~sploving@] has joined #shogun10:30
sploving1sonney2k,/../../src/interfaces/lua_modular/ undefined symbol: _ZN6shogun4CGMM10train_smemEiidid10:30
sploving1my machine crashed just now. I need reboot and just run make(cannot run other application) to compile shogun10:31
sploving1but I met the problem: lua: error loading module 'modshogun' from file '../../../src/interfaces/lua_modular/':10:31
sploving1../../../src/interfaces/lua_modular/ undefined symbol: _ZN6shogun4CGMM10train_smemEiidid10:31
@sonney2ksploving1, yes it needs 1.5 G to compile10:31
sploving1I have git clean -dfx and configure it with lua/python modular10:33
@sonney2ksploving1, I am doing now too10:34
@sonney2ksploving1, btw you can use narray.h for the api of ruby's narray10:37
sploving1oh. i hope there is a api doc10:39
@sonney2ksploving1, I couldn't find any - but the .h does contain the needed info10:39
@sonney2ke.g. IsNArray() to test if the obj. is an narray10:39
sploving1okay. I will take a look at it10:40
@sonney2kand there is RNArray with the data10:40
sploving1if it is similar to python, that maybe not difficult10:40
sploving1numpy, i mean10:40
@sonney2kit is definitely similar and not beautiful :)10:42
@sonney2kwhich example did not work?10:43
@sonney2ksploving1, ^10:43
sploving1features_simple_real_modular.lua, sonney2k10:44
sploving1does fresh shogun work well in your computer??10:45
@sonney2kI just did git clean -dfx and recompiled10:45
@sonney2kit works....10:45
@sonney2khow do you run the example?10:46
sploving1export LUA_PATH=../../../src/interfaces/lua_modular/?.lua\;?.lua10:46
sploving1export LUA_CPATH=../../../src/interfaces/lua_modular/?.so10:46
sploving1then lua features_simple_real_modular.lua10:46
@sonney2k(I ran ./
@sonney2kyes that works too10:47
sploving1oh. I wll compile it again10:50
-!- sploving1 [~sploving@] has left #shogun []10:50
-!- warpyyy [~theuser@] has joined #shogun10:52
@sonney2kblackburn, does current master compile and run for you?10:52
-!- warpyyy [~theuser@] has quit [Read error: Connection reset by peer]10:54
blackburnsonney2k: yes, all ok, interfaces=java_modular10:57
@sonney2kok then it must be sth on splovings side11:00
* sonney2k is transitioning CLabels for SGVector11:01
CIA-87shogun: Soeren Sonnenburg master * re9d4632 / src/interfaces/lua_modular/swig_typemaps.i :11:47
CIA-87shogun: Merge pull request #232 from sploving/master11:47
CIA-87shogun: fix matrix typemap(columns first then rows) -
CIA-87shogun: Baozeng Ding master * rad8130a / src/interfaces/lua_modular/swig_typemaps.i : fix matrix typemap(columns first then rows) -
-!- sploving1 [~sploving@] has joined #shogun12:06
sploving1now lua features_simple_real works!12:07
sploving1sonney2k, do you know why it produced different results?12:08
@sonney2ksploving1, and kernel too?12:09
@sonney2ksploving1, I am working on that perceptron issue12:09
@sonney2ksince that issue appears in python too - it must be some general problem12:09
sploving1I am tring kernel now12:09
sploving1sonney2k, not the same. lua generates such a long result!12:16
@sonney2ksploving1, could you please test km_train first?12:18
@sonney2kit should be as big as number of columns12:18
@sonney2ktimes number of columns12:18
@sonney2kIIRC 92x9212:18
sploving1lua: km_train: 92*812:23
sploving1sonney2k, how to print python??12:23
sploving1it has ... in the result12:24
@sonney2kyou mean print(x) ?12:24
@sonney2ksploving1, how can km_train for lua be 92x8 ?12:25
@sonney2knot possible...12:25
sploving1sonney2k,  'numpy.ndarray' object has no attribute 'repr'12:25
sploving1sonney2k, what it should be? 92*92?12:26
sploving1sonney2k, the python output still cannot be dumped using repr. it has ... in the output12:28
sploving1I mean it omit some results12:30
sploving1using repr, or print12:30
@sonney2ksploving1, then use numpy.savetxt('somefilename', km_train)12:31
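[Editor's note: sonney2k's savetxt route, sketched with a stand-in matrix — km_train here is random, not the example's actual kernel matrix. Unlike print/repr, savetxt writes every element with no '...' elision, so the dump can be diffed against the lua output.]

```python
import numpy as np

# Stand-in for the kernel matrix computed by the example (should be 92x92).
km_train = np.random.rand(92, 92)

# Dump the full matrix -- every element, no '...' elision:
np.savetxt('km_train_python.txt', km_train)

# Loading it back (or loading the other interface's dump the same way)
# lets you compare the whole thing at once:
loaded = np.loadtxt('km_train_python.txt')
assert loaded.shape == (92, 92)
assert np.allclose(loaded, km_train)
```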
@sonney2kbut that is not the actual problem...12:32
@sonney2kbtw the lua matrix is 92x92 here too12:34
sploving1this is the result12:35
sploving192*92? how do you know that??12:36
sploving1I just count rows = 92, cols = 8 about, sonney2k12:36
@sonney2knope 92x9212:40
@sonney2kand gives same result as in python btw12:42
sploving1sonney2k, you are so great. can you tell me how you do that?12:44
sploving1sonney2k, I did not see the right side12:45
@sonney2kI just write the output of the lua matrix to a file12:45
sploving1sorry for that12:45
sploving1you mean lua *.lua > 1.txt? sonney2k12:46
sploving1okay. I will move to ruby12:50
sploving1sonney2k I gtg bye12:51
@bettyboosee you12:51
@sonney2ksploving1, don't forget strings12:51
@sonney2kin lua I mean12:51
@sonney2kyou havent' tested these yet12:51
sploving1sonney2k, okay. I will test them. when output, just lua *.lua > 1.txt???12:52
@sonney2ksploving1, I don't understand what you mean12:52
sploving1in my machine, I did not see the right side of the columns12:52
sploving11 0.026404555080215 0.00051704925476818 0.013315592813501 0.98824215105277 0.65147692801705 0.14929587268109 2.354172745438e-05 0.2832383077822712:53
sploving1for instance i just see the first row is the above12:53
sploving1but when I paste them on pastebin, it has the right side12:53
sploving1so strange12:53
-!- sploving1 [~sploving@] has left #shogun []12:54
@sonney2kblackburn, why do you use m_labels in GaussianNB?13:08
@sonney2kI mean you only need this in train() or?13:09
blackburnI can't remember13:11
blackburnlet me look ;)13:11
blackburnsonney2k: m_labels is used only in train, yes13:12
@sonney2kok then I remove m_labels from the .h13:12
-!- heiko1 [~heiko@] has joined #shogun13:35
@sonney2kheiko1, hey ... you survived :)13:41
heiko1sonney2k, yes, but pain all over the body ;)13:43
heiko1how do you say muskelkater in english? :)13:43
blackburnhard weekend with gf? :D13:43
blackburnthe last words were 'girlfriend is waiting' IIRC :D13:44
heiko1oh, yes and that also :)13:44
heiko1and climbing13:44
* sonney2k sings la la la *VERY* *LOUDLY*13:44
heiko1and you guys? did you have a good weekend?13:45
@sonney2kblackburn had a lot of fun with java or coffee or so13:47
blackburnwith HLLE too13:47
@sonney2kI am currently transitioning labels to use SGVector for real (internally)13:48
-!- heiko1 [~heiko@] has quit [Ping timeout: 258 seconds]15:27
-!- heiko1 [~heiko@] has joined #shogun15:33
@sonney2kheiko1, do you know by heart where you recently added a SG_NOT_IMPLEMENTED?15:41
heiko1yes, copy_subset() of CFeatures15:44
heiko1called by KernelMachine::train15:44
heiko1an example fails?15:45
@sonney2kyeah serialization15:45
@sonney2kbut I don't think it was that one15:45
@sonney2kit is the gaussian kernel that is failing15:45
heiko1let me check15:45
heiko1compute feature vector of SimpleFeatures15:46
@sonney2kheiko1, yes that one!15:47
heiko1this method hjust returns NULL there15:47
heiko1has to be overridden or something15:47
heiko1should I remove the SG_NOTIMPLEMENTED?15:47
@sonney2kI don't know though why it is called15:47
heiko1i saw a call of it recently... i will check15:49
heiko1if no feature matrix is set15:50
@sonney2kmakes sense15:50
@sonney2kso it could be that serializaton has some chicken / egg problem15:50
@sonney2kthat kernel should be loaded but features are not yet there or so15:51
heiko1but if the SG_NOTIMPLEMENTED is removed15:51
heiko1NULL is returned there15:51
@sonney2kand the gaussian kernel does some operation in load_serializable_post15:51
@sonney2kto precompute some x_i^215:52
heiko1sonney2k, the fire-alarm just started howling here15:53
heiko1i will check whats happening15:53
@sonney2kheiko1, ok15:53
heiko1indeed, there is a fire15:55
heiko1cars ariving, i will go outside for a minute15:55
@sonney2kwho is burning?15:55
heiko1(probalby takes more)15:55
heiko1dont know15:55
blackburnit is what they call 'extreme programming'15:56
heiko1i am in 6th floor15:56
@sonney2kblackburn, your name is heiko1 program15:56
@sonney2kI see them all burning with black smoke15:57
@sonney2khope heiko1 manages to escape15:57
blackburnwhere do you see it?15:57
@sonney2klive tv of course ;-)15:58
blackburnjoke? ;)15:58
@sonney2kno of course not *eg*15:58
@sonney2kI guess this bug hunt here is driving me mad16:01
@sonney2kwhich reminds me16:01
@sonney2kblackburn, how is java going along?16:02
blackburnsonney2k: currently working on HLLE16:02
@sonney2kblackburn, that is not fair16:02
@sonney2kyou can have fun16:02
@sonney2kI have to fix bugs16:02
blackburnI'm currently fixing bugs in HLLE :D16:03
@sonney2kyou mean it is semi-fun16:03
@sonney2khmmhh not sure I can live with that16:03
heiko1small fire16:04
heiko1wow 3 cars and police here16:04
heiko1but building will not be evacuated16:04
@sonney2kheiko1, what happened?16:14
@sonney2kor what is going on?16:14
heiko1they are already gone,16:14
heiko1but I did not have the motivation to go down all 150 stairs ;)16:15
heiko1probably nothing too scary16:15
@sonney2kah no elevator...16:15
@sonney2kyou could have climbed...16:15
heiko1yes, but I dont like being in the elevator16:15
heiko1yes, its possible here .)16:15
@sonney2kanyway heiko1 I made some progress on this here16:15
@sonney2kthe strange thing is that the feature object got loaded just fine16:16
@sonney2kand it pretends that it did also load the matrix16:16
@sonney2kbut for some reason not?!16:16
heiko1are this simple features?16:18
@sonney2kheiko1, I only noticed because I changed all of labels and thus lots of classifiers16:19
@sonney2kand so I recognized that some examples fail...16:19
heiko1is the load method of SimpleFfeatures called?16:20
@sonney2kenabling debug I see that that simple features are already loaded...16:21
@sonney2kheiko1, w/ debug on I see that16:45
@sonney2k[DEBUG] START LOADING CSGObject 'SimpleFeatures'16:45
@sonney2k[DEBUG] Loading parameter 'feature_matrix' of type 'Matrix<float64>'16:45
@sonney2k[DEBUG] DONE LOADING CSGObject 'SimpleFeatures' (0x3c973b0)16:45
@sonney2kbut then in kernel lhs=0x3c973b0 '(nil)' num_vec_fm=0 num_feat_fm=0 num_vec=20 num_feat=216:45
@sonney2kthe nil there corresponds to no feature matrix around16:45
@sonney2kand the num_vec/feat_fm =0 indicate that the matrix indeed did not get loaded16:46
heiko1strange :(16:48
@sonney2kheiko1, does the matrix stuff work at all?16:48
@sonney2klet me create a fool proof example16:48
heiko1what do you mean with matrix stuff?16:48
@sonney2kanswer is no16:50
@sonney2kfrom modshogun import *16:50
@sonney2kfrom numpy import array16:50
@sonney2kfstream = SerializableAsciiFile("foo.asc", "w")16:50
@sonney2kbut in foo.asc we have feature_matrix Matrix<float64> 0 0 ()16:51
heiko1so which part is not working? save or load?16:52
@sonney2kheiko1, vector works though16:52
@sonney2kfstream = SerializableAsciiFile("foo2.asc", "w")16:52
@sonney2klabels Vector<float64> 3 ({1}{2}{3})16:52
@sonney2kheiko1, do you have an idea where I should look for the bug?16:53
@sonney2kor do you even know what the problem could be?16:53
heiko1i mean the save code of SimpleFeatures is short.16:54
heiko1writer->f_write(feature_matrix, num_features, num_vectors);16:54
@sonney2kheiko1, not the save code of simplefeatures16:54
heiko1ah sorry16:54
@sonney2klike where you did add support for SGVector / SGMatrix :)16:54
heiko1oh, ... mmh, perhaps the add methods of Parameter I did are wrong16:55
@sonney2kor not - we have to find out16:55
@sonney2kthat was in shogun/base/Parameter.cpp?16:56
heiko1all the add methods16:56
heiko1with SGVector SGMatrix16:56
heiko1hope theres no mistake in them16:57
@sonney2kbut we are not even using SGMatrix etc16:57
heiko1then this cant be the mistake16:57
@sonney2kwe use the add_matrix stuff16:57
heiko1but this wasnt touched recently or?16:58
@sonney2kmaybe for the subsetting business16:59
@sonney2kheiko1, I mean there are feature_matrix_num_vectors etc16:59
@sonney2kand these are 0 too17:00
heiko1I just got an idea17:00
heiko1let me check17:00
@sonney2kand indeed they are17:00
-!- in3xes [~in3xes@] has joined #shogun17:00
heiko1perhaps this has to do something with the variables that i changed17:01
heiko1the add methods have also been changed17:01
heiko1At sometime I replaced the features by a SGVector17:01
@sonney2kheiko1, no I think it is a bug in simplefeatures somehow17:01
heiko1but undid this17:01
heiko1in SimpleFeatures17:01
@sonney2kI mean dimensions of feature matrix need to be non-zero17:02
@sonney2kcould very well be my fault too...17:02
blackburnhooray to new heisenbug in arpack wrapper!17:02
heiko1sonney2k, I have an appointment in a few minutes, sorry for that, but I will be back later17:02
@sonney2kheiko1, found the bug!17:03
@sonney2kheiko1, in the set_feature_matrix for SGMatrix type17:04
@sonney2kit was forgotten to set feature_matrix_num_vectors17:05
@sonney2keverywhere else it was ok17:05
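[Editor's note: a sketch of the bug class sonney2k found — names are illustrative Python, not shogun's actual C++. The setter updates the matrix but forgets the registered dimension fields, so serialization later writes "feature_matrix Matrix<float64> 0 0 ()" exactly as seen in foo.asc above.]

```python
class SimpleFeaturesSketch:
    """Hypothetical stand-in for SimpleFeatures' serialized state."""

    def __init__(self):
        self.feature_matrix = None
        self.num_features = 0   # serialized alongside the matrix
        self.num_vectors = 0    # serialized alongside the matrix

    def set_feature_matrix_buggy(self, matrix):
        self.feature_matrix = matrix
        # BUG: num_features / num_vectors are never updated, so they stay 0
        # and the serializer writes an empty "0 0 ()" matrix.

    def set_feature_matrix_fixed(self, matrix):
        self.feature_matrix = matrix
        self.num_features = len(matrix)      # rows = features
        self.num_vectors = len(matrix[0])    # columns = examples

f = SimpleFeaturesSketch()
f.set_feature_matrix_buggy([[1.0, 2.0], [3.0, 4.0]])
assert (f.num_features, f.num_vectors) == (0, 0)   # what the .asc file showed

f.set_feature_matrix_fixed([[1.0, 2.0], [3.0, 4.0]])
assert (f.num_features, f.num_vectors) == (2, 2)
```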
heiko1glad you found it :)17:05
heiko1so, see you this in the evening17:06
-!- in3xes [~in3xes@] has quit [Ping timeout: 276 seconds]17:16
-!- in3xes [~in3xes@] has joined #shogun17:28
CIA-87shogun: Soeren Sonnenburg master * rd85cb65 / (23 files in 9 dirs):17:35
CIA-87shogun: remove unused confidences from labels and add SGVector in methods17:35
CIA-87shogun: utilizing labels when possible -
CIA-87shogun: Soeren Sonnenburg master * rd2f13fb / examples/undocumented/python_modular/ : add example to just serialize matrix -
CIA-87shogun: Soeren Sonnenburg master * r58fd62c / (10 files in 7 dirs):17:35
CIA-87shogun: various bugfixes related to SGVector/SGMatrix transition17:35
CIA-87shogun: - in out typemap of python_modular for vectors17:35
CIA-87shogun: - in simplefeatures set_feature_matrix17:35
CIA-87shogun: - KRR double free17:35
CIA-87shogun:  ... -
-!- CIA-87 was kicked from #shogun by bettyboo [flood]17:35
-!- CIA-87 [] has joined #shogun17:35
@sonney2kblackburn, yay!17:36
@sonney2kall examples work again :)17:36
@sonney2kpython_modular only of course17:37
CIA-87shogun: Soeren Sonnenburg master * r00d57d5 / examples/undocumented/python_modular/ : fix clustering example -
CIA-87shogun: Sergey Lisitsyn master * rbcb7bb8 / (2 files): Added DORGQR routine wrapper for lapack -
CIA-87shogun: Sergey Lisitsyn master * r988360c / (5 files in 2 dirs): Introduced Hessian Locally Linear Embedding preprocessor -
CIA-87shogun: Sergey Lisitsyn master * r179091b / (2 files): Fixed LLE and added HLLE python modular example -
blackburnsonney2k: vodka?18:42
-!- gsomix [~gsomix@] has joined #shogun19:25
gsomixhi all19:25
-!- f-x [~user@] has quit [Remote host closed the connection]19:28
gsomixsonney2k, i saw own ohloh account. I think i did something wrong with commiting. :)19:30
gsomixAll Languages. Total Lines Changed: 221,575.19:34
heiko1sonney2k, are you there?20:06
@sonney2kheiko1, yes20:47
@sonney2kheiko1, wassup?20:47
heiko1did you receive my email?20:48
heiko1I want to create StringFeatures20:48
heiko1but it does not work20:48
heiko1because CAlphabet::check_alphabet() fails20:48
heiko1ALPHABET does not contain all symbols in histogram20:48
heiko1and I am a bit unsure, what I am doing wrong here20:49
heiko1the program creates some random char strings (this works, they are printed) and then creates a CStringFeatures instance20:49
heiko1and then SG_ERROR20:49
CIA-87shogun: Soeren Sonnenburg master * rfe563a7 / src/shogun/features/Alphabet.h :20:51
CIA-87shogun: fix documentation for ALPHANUM and PROTEIN alphabets (they take upper20:51
CIA-87shogun: case chars not lowercase) -
@sonney2kheiko1, ^20:51
@sonney2kUPPER CASE20:52
@sonney2kso 0x41 and more20:52
heiko1oh no :)20:53
heiko1silly mistake20:53
@sonney2kheiko1, well documentation was wrong...20:53
@sonney2kit said a-z20:53
@sonney2knot A-Z ...20:53
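[Editor's note: a sketch of the check that failed for heiko1 — illustrative, not CAlphabet::check_alphabet() itself. Every character occurring in the input strings must be a member of the alphabet, and the alphabet holds upper-case symbols, i.e. 0x41 ('A') and up.]

```python
# Hypothetical DNA alphabet; shogun's real alphabets live in Alphabet.h.
DNA = set('ACGT')

def check_alphabet(strings, alphabet):
    """Build a character histogram and verify every symbol is in the alphabet."""
    histogram = {}
    for s in strings:
        for ch in s:
            histogram[ch] = histogram.get(ch, 0) + 1
    return all(ch in alphabet for ch in histogram)

assert check_alphabet(['ACGT', 'TTGA'], DNA)
# Lowercase input fails, because the alphabet contains upper-case chars only:
assert not check_alphabet(['acgt'], DNA)
assert ord('A') == 0x41
```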
@sonney2kblackburn, vodka!20:53
blackburnnot yet! found error :D20:54
@sonney2kblackburn, you definitely need that to start working on java :D20:54
blackburnwith arpack the solution is kinda wrong20:54
heiko1sonney2k, well however, thanks .)20:54
blackburntemporary will force to use lapack20:54
@sonney2kgsomix, sounds like it...20:54
@sonney2kheiko1, IIRC there is some debug option - then it will display the histogram...21:05
heiko1sonney2k, works now :)21:05
CIA-87shogun: Sergey Lisitsyn master * r9ff9954 / (3 files): Added force_lapack option for Locally Linear Embedding and forced HLLE to use lapack solver -
@sonney2kheiko1, surprise ;-)21:06
heiko1sonney2k, I will try to generalise the model selecttion parameters now21:06
heiko1was trying to do model selection with a string kernel21:06
heiko1but it has int32_t type parameters21:06
heiko1that does not work yet21:06
@sonney2kheiko1, so you do all the other standard types like int / byte etc right?21:06
@sonney2kshould be easy given that you have double working already21:07
@sonney2kenums can be a bit problematic though - but in the end these are ints too21:07
heiko1i am not that deep into generics21:09
heiko1hope it is possible to append an instance of one modelparam to another with another type21:09
@sonney2kheiko1, enums are usually represented as integers - so at least from C/C++ / python it will work21:11
@sonney2kone could specify illegal values though - but that should be caught anyways21:12
heiko1uuh this is harder than i thought21:23
heiko1all these generic classes build trees21:23
heiko1I dont know if it is possible to have a datastructure that holds instances of generic classes with different types?21:23
heiko1no, its not21:25
heiko1sonney2k, basically this problem:21:29
@sonney2kyes indeed that is not working21:32
@sonney2kheiko1, but can't you define all the types / trees that you need?21:32
heiko1but one tree may have different node types21:32
@sonney2kI mean there are only a handful21:32
heiko1like one parameter int and another float21:33
heiko1both children of one node21:33
@sonney2kwhat you could certainly do is to store the node types21:34
heiko1I think I will have to do that21:35
@sonney2kand then have some access function that gives you the correct item from a union or so based on that type21:35
@sonney2kor you have a node content base class21:36
@sonney2kand then have derived classes for each node content type21:37
@sonney2kbut same thing not type safe21:37
@sonney2kyou need to store the type again21:37
heiko1I think I will just store the type and save the data with void pointers21:38
@sonney2kconsidering that we need just int, byte, bool - it is probably easiest to just use a union21:38
@sonney2kor that yes21:39
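[Editor's note: a language-agnostic sketch in Python of the scheme discussed — store a type tag next to each node's value so children of different types can share one tree. Names are hypothetical, not shogun's model-selection API.]

```python
# Type tags standing in for the handful of types mentioned (int, byte, bool, float).
INT, FLOAT, BOOL = 'int32', 'float64', 'bool'

class ParamNode:
    """Tree node holding a parameter value plus its stored type tag."""

    def __init__(self, name, value_type=None, value=None):
        self.name = name
        self.value_type = value_type   # the tag that replaces C++ type safety
        self.value = value
        self.children = []

    def append_child(self, child):
        # Children of mixed types can live in the same tree ...
        self.children.append(child)

root = ParamNode('kernel')
root.append_child(ParamNode('degree', INT, 3))      # int parameter
root.append_child(ParamNode('width', FLOAT, 2.5))   # float parameter

# ... and the tag tells the consumer how to interpret each value,
# which is what a union / void* plus stored type does in C++.
types = [c.value_type for c in root.children]
assert types == ['int32', 'float64']
```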
@sonney2kblackburn, like java examples SCNR :D21:39
blackburnit drives me mad21:40
blackburnI thought HLLE does21:40
blackburnbut it is not21:40
blackburnI guess I should write sth like "If embedding is shitty try another data" in description21:41
blackburnsonney2k: you want me doing some not-funny things, don't you?21:50
* blackburn wonders if there will be some ModifiedHessianLocallyLinearEmbedding21:52
blackburnor even ImplicitlyRestartedModifiedStableHessianLocallySublinearMegaEmbedding21:53
blackburnHLLE rocks!22:04
blackburnsonney2k: I guess you might understand how good it is:
blackburndamn that guy looks just like me :D22:10
-!- in3xes [~in3xes@] has quit [Quit: Leaving]22:15
-!- gsomix [~gsomix@] has quit [Ping timeout: 240 seconds]22:26
CIA-87shogun: Sergey Lisitsyn master * r053c634 / .gitignore : Added some more filetypes to ignore by git -
CIA-87shogun: Sergey Lisitsyn master * rac722b7 / src/shogun/mathematics/arpack.cpp : Improved arpack wrapper -
CIA-87shogun: Sergey Lisitsyn master * r1fff667 / src/shogun/preprocessor/LocallyLinearEmbedding.cpp : Improved stability of Locally Linear Embedding -
CIA-87shogun: Sergey Lisitsyn master * re11c4bc / src/shogun/preprocessor/HessianLocallyLinearEmbedding.cpp : Improved HLLE -
CIA-87shogun: Sergey Lisitsyn master * r05427ed / src/shogun/preprocessor/LocallyLinearEmbedding.cpp : Changed solver for LLE and removed unnecessary default parameter -
-!- blackburn [~blackburn@] has left #shogun []23:42
--- Log closed Tue Jul 26 00:00:21 2011