--- Log opened Sat Feb 23 00:00:49 2013
00:05 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 7421ea5 / src/shogun/lib/tapkee/ (3 files): https://github.com/shogun-toolbox/shogun/commit/7421ea58a264f33fd8b271477f3548b19147e498
00:05 <shogun-notifier-> shogun: Updates for tapkee
00:07 <shogun-notifier-> shogun: Sergey Lisitsyn :master * d91f97e / src/shogun/lib/tapkee/routines/locally_linear.hpp: https://github.com/shogun-toolbox/shogun/commit/d91f97eafc85281a638bdd8a67f9fbd79588f018
00:07 <shogun-notifier-> shogun: Changed eigen old define to tapkee internal define
00:08 <shogun-buildbot> build #867 of deb1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/deb1%20-%20libshogun/builds/867  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
00:11 -!- wiking [~wiking@info2k1.hu] has quit [Quit: leaving]
00:12 <shogun-buildbot> build #868 of deb1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/deb1%20-%20libshogun/builds/868  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
01:02 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 28c5f41 / src/shogun/lib/tapkee/routines/methods_traits.hpp: https://github.com/shogun-toolbox/shogun/commit/28c5f412fbae754d2a827264af96cf3170e3b263
01:02 <shogun-notifier-> shogun: Added missed method traits
01:15 <shogun-buildbot> build #869 of deb1 - libshogun is complete: Success [build successful]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/deb1%20-%20libshogun/builds/869
01:18 -!- travis-ci [~travis-ci@ec2-54-242-202-157.compute-1.amazonaws.com] has joined #shogun
01:18 <travis-ci> [travis-ci] it's Sergey Lisitsyn's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun: http://travis-ci.org/shogun-toolbox/shogun/builds/4993934
01:18 -!- travis-ci [~travis-ci@ec2-54-242-202-157.compute-1.amazonaws.com] has left #shogun []
01:19 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 2ba8bb6 / src/shogun/lib/tapkee/ (7 files): https://github.com/shogun-toolbox/shogun/commit/2ba8bb6338b172145ccacb501c57be2dbc70aa39
01:19 <shogun-notifier-> shogun: Added projecting functions capabilities in tapkee
01:20 <blackburn> oh gosh 5 hours of that stuff
01:21 <shogun-buildbot> build #227 of ubu1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/227  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
01:28 -!- travis-ci [~travis-ci@ec2-54-242-202-157.compute-1.amazonaws.com] has joined #shogun
01:29 <travis-ci> [travis-ci] it's Sergey Lisitsyn's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun: http://travis-ci.org/shogun-toolbox/shogun/builds/4994111
01:29 -!- travis-ci [~travis-ci@ec2-54-242-202-157.compute-1.amazonaws.com] has left #shogun []
01:30 <shogun-buildbot> build #228 of ubu1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/228  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
01:31 <shogun-buildbot> build #556 of cyg1 - libshogun is complete: Success [build successful]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/cyg1%20-%20libshogun/builds/556
01:46 -!- n4nd0 [53b32c87@gateway/web/freenode/ip.83.179.44.135] has quit [Quit: Page closed]
01:53 <shogun-buildbot> build #826 of deb3 - modular_interfaces is complete: Failure [failed test ruby_modular]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/deb3%20-%20modular_interfaces/builds/826  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
04:00 -!- FSCV [~FSCV@187.210.54.166] has quit [Quit: Leaving]
04:04 <shogun-buildbot> build #291 of nightly_default is complete: Success [build successful]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/nightly_default/builds/291
04:19 -!- shogun-notifier- [~irker@7nn.de] has quit [Quit: transmission timeout]
08:18 -!- hoijui [~hoijui@dslb-092-078-043-220.pools.arcor-ip.net] has joined #shogun
10:12 -!- hoijui [~hoijui@dslb-092-078-043-220.pools.arcor-ip.net] has quit [Ping timeout: 260 seconds]
10:19 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun
10:29 <n4nd0> blackburn: good morning
10:30 <n4nd0> I have just profiled isomap
10:30 <blackburn> n4nd0: hey
10:30 <n4nd0> checking the output now with kcachegrind
10:30 <n4nd0> do you want to take a look at the file?
10:30 <blackburn> oh cool, tell me what the bottleneck is
10:30 <blackburn> :)
10:30 <blackburn> callgrind you mean?
10:30 <blackburn> not really, I can generate it too - just let me know what is slow
10:31 <n4nd0> I check it with kcachegrind
10:31 <n4nd0> give me some time to analyze it then :)
10:34 <n4nd0> blackburn: compute_shortest_distances_matrix is where the program spends the most time
10:34 <n4nd0> but that is sort of obvious I think
10:34 <blackburn> n4nd0: better surprise me
10:34 <blackburn> :D
10:34 <blackburn> n4nd0: actually sklearn implements that via cython
10:35 <blackburn> so no matter what we do it would be sort of similar
10:35 <n4nd0> blackburn: do you think we can do something smart there to make it faster?
10:35 <blackburn> n4nd0: some quantum computing
10:35 <n4nd0> ??
10:36 <blackburn> n4nd0: but to be serious, no, it seems we can't
10:36 <blackburn> maybe some microoptimizations
10:37 <n4nd0> Eigen::redux_impl is the other part where the time is spent
10:39 <blackburn> n4nd0: not sure what that is
10:39 <n4nd0> blackburn: something that is called mostly from compute_shortest_distances
10:39 <blackburn> n4nd0: maybe that's a distance callback
10:40 <n4nd0> almost 10 million times for a 1000-point swissroll
10:40 <n4nd0> it is something from Eigen::internal so I am not sure what it is either
10:40 <n4nd0> brb
10:41 -!- trtr3434 [~trtr3434@sta-104-34.tm.net.my] has joined #shogun
10:41 -!- trtr3434 [~trtr3434@sta-104-34.tm.net.my] has left #shogun []
10:42 <blackburn> almost 10 million times is surely the distance
10:45 <n4nd0> blackburn: so it should probably be something called from Eigen to compute the distance
10:45 <blackburn> n4nd0: yes, as distances are computed using eigen
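The bottleneck being discussed — Isomap's all-pairs shortest-path step — can be sketched in a few lines of NumPy/SciPy. This is an illustrative reimplementation, not Shogun's C++ code; the function name `isomap_geodesics` and the use of `scipy.sparse.csgraph` are assumptions made for the sketch.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import cdist

def isomap_geodesics(X, k=5):
    """Build a k-NN graph over the points, then compute all-pairs
    shortest paths with Dijkstra - the step that dominates the profile,
    since it runs one Dijkstra per point (roughly O(n^2 log n) overall),
    with a distance evaluation at every edge relaxation."""
    D = cdist(X, X)                       # dense pairwise Euclidean distances
    n = D.shape[0]
    # each point's k nearest neighbours (column 0 is the point itself)
    idx = np.argsort(D, axis=1)[:, 1:k + 1]
    G = np.zeros_like(D)
    rows = np.repeat(np.arange(n), k)
    G[rows, idx.ravel()] = D[rows, idx.ravel()]
    # geodesic distances on the symmetrised k-NN graph; zeros mean "no edge"
    return shortest_path(G, method="D", directed=False)

# toy check: for points on a line, geodesics equal Euclidean distances
pts = np.arange(6, dtype=float).reshape(-1, 1)
geo = isomap_geodesics(pts, k=2)
```

Little can be done asymptotically here, which matches the conclusion above; the per-edge distance evaluations are exactly the kind of inner call that shows up millions of times in the profile.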
11:01 <blackburn> n4nd0: ah btw
11:02 <blackburn> will you have time next friday?
11:02 <blackburn> friday night, to be more precise
11:03 <n4nd0> i think so
11:03 <n4nd0> why?
11:05 <n4nd0> blackburn: ^
11:05 <blackburn> n4nd0: chris will have some time and we planned to hack on the paper
11:06 <n4nd0> blackburn: cool
11:06 <n4nd0> I will be available then
11:06 <blackburn> n4nd0: nice
11:07 <blackburn> n4nd0: I think a few hours should be enough once everything software-related is ready
11:07 <n4nd0> ok
11:10 <blackburn> n4nd0: we should release tapkee 0.1 I think
11:10 <n4nd0> blackburn: that sounds good
11:10 <blackburn> n4nd0: maybe friday too
11:11 <n4nd0> blackburn: what do we need to have ready before the release?
11:11 <blackburn> n4nd0: I don't know - it is pretty ok already
11:11 <n4nd0> cool
11:11 <blackburn> n4nd0: I wanted to attempt to fix the gpu part
11:15 <blackburn> n4nd0: we should also get shogun released :)
11:15 <n4nd0> blackburn: GPUDenseMatrixOperation and GPUDenseImplicitSquareMatrixOperation are the parts that use the GPU, right?
11:15 <n4nd0> blackburn: hehe yeah
11:15 <blackburn> n4nd0: exactly
11:26 <n4nd0> blackburn: the results were normally the same, weren't they?
11:26 <n4nd0> it was only a matter of speedup?
11:26 <blackburn> n4nd0: I thought they weren't the same
11:26 <n4nd0> I remember I tested it
11:27 <blackburn> and it was just slower?
11:27 <n4nd0> I don't remember if the results were always the same
11:27 <blackburn> hmm okay
13:00 -!- shogun-notifier- [~irker@7nn.de] has joined #shogun
13:00 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 064b53c / examples/undocumented/libshogun/library_hdf5.cpp: https://github.com/shogun-toolbox/shogun/commit/064b53cd4ac01a8438efb7abb4268d87b359c1bf
13:00 <shogun-notifier-> shogun: Added missed free in library hdf5 example
13:05 -!- travis-ci [~travis-ci@ec2-107-22-88-191.compute-1.amazonaws.com] has joined #shogun
13:05 <travis-ci> [travis-ci] it's Sergey Lisitsyn's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun: http://travis-ci.org/shogun-toolbox/shogun/builds/5000219
13:05 -!- travis-ci [~travis-ci@ec2-107-22-88-191.compute-1.amazonaws.com] has left #shogun []
13:11 <shogun-buildbot> build #229 of ubu1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/229  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
13:47 <shogun-buildbot> build #827 of deb3 - modular_interfaces is complete: Success [build successful]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/deb3%20-%20modular_interfaces/builds/827
14:11 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Quit: leaving]
14:28 <shogun-notifier-> shogun: Sergey Lisitsyn :master * de82da5 / examples/undocumented/libshogun/library_serialization.cpp: https://github.com/shogun-toolbox/shogun/commit/de82da565a84e6a9fac01b7ba1ee2924fde772ed
14:28 <shogun-notifier-> shogun: Fixed leaks in library serialization
14:28 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 5fcbf7b / src/shogun/lib/tapkee/ (3 files): https://github.com/shogun-toolbox/shogun/commit/5fcbf7b7f0df8492e99e21847dee86a876eb2d01
14:28 <shogun-notifier-> shogun: Old eigen support in tapkee and laplacian eigenmaps fix
14:32 -!- travis-ci [~travis-ci@ec2-107-22-88-191.compute-1.amazonaws.com] has joined #shogun
14:32 <travis-ci> [travis-ci] it's Sergey Lisitsyn's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun: http://travis-ci.org/shogun-toolbox/shogun/builds/5001158
14:32 -!- travis-ci [~travis-ci@ec2-107-22-88-191.compute-1.amazonaws.com] has left #shogun []
14:54 <shogun-buildbot> build #230 of ubu1 - libshogun is complete: Success [build successful]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/230
15:01 <shogun-buildbot> build #231 of ubu1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/231  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
15:26 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 77b9c8e / applications/ (32 files): https://github.com/shogun-toolbox/shogun/commit/77b9c8e936ab5258165f22b44e8800e7b501a5fe
15:26 <shogun-notifier-> shogun: Updated old edrt folder in applications
15:31 <shogun-buildbot> build #232 of ubu1 - libshogun is complete: Failure [failed compile]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/232  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
15:33 -!- travis-ci [~travis-ci@ec2-107-22-88-191.compute-1.amazonaws.com] has joined #shogun
15:33 <travis-ci> [travis-ci] it's Sergey Lisitsyn's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun: http://travis-ci.org/shogun-toolbox/shogun/builds/5001926
15:33 -!- travis-ci [~travis-ci@ec2-107-22-88-191.compute-1.amazonaws.com] has left #shogun []
15:34 <blackburn> ubu1 is not being updated
15:34 <blackburn> argh
15:52 <shogun-buildbot> build #233 of ubu1 - libshogun is complete: Success [build successful]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/ubu1%20-%20libshogun/builds/233
15:52 <blackburn> finally
17:00 -!- hoijui [~hoijui@dslb-092-078-043-220.pools.arcor-ip.net] has joined #shogun
17:26 <shogun-notifier-> shogun: Sergey Lisitsyn :master * 9554dab / src/shogun/structure/StructuredModel.h,src/shogun/structure/libbmrm.h: https://github.com/shogun-toolbox/shogun/commit/9554dab08344a46be07757dfb2ab0a9f8d8f50a3
17:26 <shogun-notifier-> shogun: Added some doc
17:37 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has joined #shogun
18:11 <shogun-buildbot> build #831 of deb3 - modular_interfaces is complete: Failure [failed test python_modular]  Build details are at http://www.shogun-toolbox.org/buildbot/builders/deb3%20-%20modular_interfaces/builds/831  blamelist: Sergey Lisitsyn <lisitsyn.s.o@gmail.com>
18:22 -!- n4nd0 [~nando@s83-179-44-135.cust.tele2.se] has quit [Quit: leaving]
18:38 -!- hoijui [~hoijui@dslb-092-078-043-220.pools.arcor-ip.net] has quit [Ping timeout: 252 seconds]
19:10 -!- wiking [~wiking@huwico/staff/wiking] has joined #shogun
19:10 <wiking> hu
19:10 <blackburn> hu!
19:10 -!- zxtx [~zv@c-24-18-130-24.hsd1.wa.comcast.net] has quit [Ping timeout: 255 seconds]
19:22 -!- zxtx [~zv@c-24-18-130-24.hsd1.wa.comcast.net] has joined #shogun
20:26 -!- shogun-notifier- [~irker@7nn.de] has quit [Quit: transmission timeout]
20:32 -!- hoijui [~hoijui@141.23.95.39] has joined #shogun
20:44 -!- hoijui [~hoijui@141.23.95.39] has quit [Ping timeout: 264 seconds]
21:06 <@sonney2k> wiking, please bring the bsd1 buildbot back to life
21:19 <wiking> sonney2k: shit, it's down again
21:19 <wiking> up and running
21:55 -!- hoijui [~hoijui@dslb-092-078-043-220.pools.arcor-ip.net] has joined #shogun
22:03 <@sonney2k> wiking, thx
22:03 <@sonney2k> wiking, can we get rid of this warning:
22:03 <@sonney2k> http://shogun-toolbox.org/buildbot/builders/bsd1%20-%20libshogun/builds/691/steps/test/logs/warnings%20%281%29
22:20 -!- alexlovesdata [~binder@e178001222.adsl.alicedsl.de] has joined #shogun
22:20 <alexlovesdata> anybody still out there?
22:21 <alexlovesdata> I want to make CRandomFourierGaussPreproc applicable to CDotFeatures
22:21 <alexlovesdata> it seems that I have to derive a base class from CPreproc to define an interface for that
22:23 <alexlovesdata> my question is: what apply() interface should that base class (I call it CDotPreproc), derived from CPreproc, have?
22:28 <blackburn> alexlovesdata: hey there
22:28 <alexlovesdata> hey blackburn :)
22:28 <blackburn> alexlovesdata: let me check
22:28 <alexlovesdata> I will derive it from CPreprocessor ...
22:30 <blackburn> alexlovesdata: do you need out-of-sample for that preprocessor?
22:30 <blackburn> I guess so..
22:31 <alexlovesdata> what do you mean by out-of-sample?
22:32 <alexlovesdata> until now it is derived only from CDensePreprocessor<float64_t>
22:32 <blackburn> alexlovesdata: you need to 'learn' it on training data and then apply it to test data, right?
22:32 <alexlovesdata> no ...
22:32 <alexlovesdata> you initialize it once
22:32 <alexlovesdata> for testing you need to save the init parameters
22:33 <alexlovesdata> but the init parameters do not depend on the data
22:33 <alexlovesdata> however one needs to obtain a kernel width estimate prior to using it ..
22:35 <blackburn> alexlovesdata: so you need to apply it to features only?
22:35 <alexlovesdata> yep
22:35 <blackburn> alexlovesdata: I am afraid there is no generic interface for that..
22:35 <alexlovesdata> but for application to test settings I need to save the settings via get_randomcoefficients ;)
22:35 <alexlovesdata> I think the same ...
22:35 <blackburn> whoops
22:35 <alexlovesdata> no problem ...
22:36 <blackburn> these functions are pretty wrong
22:36 <alexlovesdata> for me it is not clear how to apply this to sparse features other than mistreating them as dense ones
22:36 <blackburn> alexlovesdata: what is the output of applying it to sparse features?
22:36 <alexlovesdata> same ... for string features I would need a gaussian distribution in string space
22:37 <alexlovesdata> the output is a dense feature with a changed dimensionality
22:37 <blackburn> alexlovesdata: the preprocessor part is heavily broken in some ways :D
22:37 <blackburn> okay, let's see
22:37 <blackburn> I do not mind changing that, I just need to figure out how
22:37 <alexlovesdata> ye ... to me the interface part also seems somewhat strange
22:38 <alexlovesdata> typically it would make sense to use it with streaming features or any features over a numeric type
22:39 <alexlovesdata> a preprocessor gets one feature type as input but returns another feature type as output
22:39 <blackburn> alexlovesdata: that is actually a 'converter'
22:39 <alexlovesdata> it should work on a feature matrix and on a single feature
22:39 <blackburn> alexlovesdata: all dimension reduction is in the 'converter' folder
22:39 <alexlovesdata> for a preprocessor we need a routine for checking the input and the output type
22:40 <alexlovesdata> of the feature
22:40 <blackburn> alexlovesdata: okay, what about inheriting it from EmbeddingConverter
22:40 <alexlovesdata> e.g. the RF preproc has a set of allowed feature types as input ...
22:40 <blackburn> shogun/converter/EmbeddingConverter.h
22:40 <alexlovesdata> i'll look it up
22:41 <blackburn> alexlovesdata: it fits your requirement
22:41 <blackburn> input is any features
22:41 <blackburn> object
22:41 <blackburn> output is dense
22:42 <alexlovesdata> ye ... except that it lacks a check for the allowed input feature type
22:42 <alexlovesdata> however that can be done at runtime
22:42 <blackburn> alexlovesdata: yeah, just check it in the embed method
22:42 <blackburn> alexlovesdata: I can help you with the first step
22:42 <alexlovesdata> I need to check for conflicts with double inheritance w.r.t. apply
22:43 <blackburn> I can move it to converter and then you can fix the things you'd like to fix
22:43 <blackburn> alexlovesdata: double inheritance?
22:43 <alexlovesdata> ye, from CDensePreprocessor<float64_t>
22:44 <blackburn> alexlovesdata: no, it won't be inherited from the dense preprocessor then
22:44 <alexlovesdata> and from EmbeddingConverter
22:44 <blackburn> no, no double inheritance :)
22:44 <alexlovesdata> does it make sense to move it from the preprocessors to the converters?
22:44 <alexlovesdata> any gain in usability?
22:45 <blackburn> alexlovesdata: well, a more generic interface (that embed method)
22:45 <blackburn> and apply
22:45 -!- hoijui [~hoijui@dslb-092-078-043-220.pools.arcor-ip.net] has quit [Ping timeout: 252 seconds]
22:47 <blackburn> alexlovesdata: so just a cleaner interface
22:47 <blackburn> I do not know what to do with the preprocessor yet
22:48 <blackburn> alexlovesdata: you don't use it as an on-the-fly thing, right?
22:49 <alexlovesdata> with CStreamingFeatures it would make pretty good sense
22:49 <alexlovesdata> but for that I need an interface which converts one feature vector into another
22:49 <blackburn> hmm, then I'd keep it in preprocessor
22:49 <@sonney2k> alexlovesdata, err err
22:49 <alexlovesdata> i.e. x -> R(x)
22:49 <blackburn> you already have it here
22:49 <alexlovesdata> err err = ? thinking error?
22:49 <blackburn> the master is here
22:50 <@sonney2k> alexlovesdata, you should implement your gaussianfourierfeatures from scratch
22:50 <blackburn> haha I'd say something about shogun from scratch
22:50 <@sonney2k> don't even think about wasting brain & cpu cycles!
22:50 <alexlovesdata> with what goal/direction Sonne?
22:50 <@sonney2k> alexlovesdata, speed!
22:50 <alexlovesdata> huh
22:50 <@sonney2k> there is almost no use for the stuff you did in the preprocessor!
22:50 <blackburn> I see nothing terribly wrong here
22:51 <blackburn> sonney2k: what is wrong?
22:51 <alexlovesdata> i have no experience w.r.t. speed optimizations
22:51 <@sonney2k> the difference is that dotfeatures only iterate over non-zero features
22:51 <@sonney2k> for speed
22:51 <alexlovesdata> but if you tell me what you have in mind I can hopefully understand
22:51 <blackburn> but don't say anything about std::copy :D
22:51 <@sonney2k> so doing a transformation x -> Phi(x) is not helpful
22:51 <@sonney2k> (which is what preprocessors do)
22:52 <blackburn> alexlovesdata: he probably means you need to emulate dot and add
22:52 <blackburn> not compute it explicitly
22:52 <alexlovesdata> or he means that I use the dot from it
22:54 <blackburn> sonney2k: I wish it was possible with HKM but it would be wrong
22:54 <alexlovesdata> in principle it can be done IF I can embed the vector as a dot feature itself ...
22:54 <alexlovesdata> that depends on the type of the dot feature
22:54 <blackburn> alexlovesdata: are these random features still relevant?
22:55 <alexlovesdata> if you want to approximate a gaussian kernel by transforming features
22:55 <alexlovesdata> so that the scalar product of the transformed features approximates a gaussian, ye
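The construction described here (random Fourier features, after Rahimi and Recht) can be sketched in NumPy. This is an illustration of the idea behind CRandomFourierGaussPreproc, not Shogun's implementation; the function name and parameters are hypothetical, and only the saved coefficients (playing the role of get_randomcoefficients) carry over from training to test time.

```python
import numpy as np

def random_fourier_features(X, n_features=500, gamma=0.5, rng=None):
    """Draw w ~ N(0, 2*gamma*I) and b ~ U[0, 2pi) once ("initialize it
    once"), then map x -> sqrt(2/D) * cos(x W + b). The dot product of
    two mapped vectors approximates exp(-gamma * ||x - y||^2); the init
    parameters (W, b) do not depend on the data, only on gamma."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    # (W, b) must be saved and reused to transform test data consistently
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b), (W, b)

X = np.random.default_rng(0).normal(size=(2, 3))
Z, coeffs = random_fourier_features(X, n_features=20000, gamma=0.5, rng=1)
approx = Z[0] @ Z[1]                                   # linear dot in feature space
exact = np.exp(-0.5 * np.sum((X[0] - X[1]) ** 2))      # gaussian kernel value
```

With enough random features the linear dot product converges to the exact kernel value, which is why the transformed features can feed a plain linear (dot-product) machine.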
22:55 <@sonney2k> alexlovesdata, did you ask andrea about the post-workshop stuff?
22:55 <blackburn> alexlovesdata: I mean HKM seems to be pretty cool already
22:55 <alexlovesdata> not yet, Sonne
22:56 <alexlovesdata> HKM = hierarchical k-means?
22:56 <@sonney2k> alexlovesdata, please do next week...
22:56 <alexlovesdata> ye, Sonne
22:56 <blackburn> alexlovesdata: homo gay map
22:56 <@sonney2k> thx
22:56 <blackburn> alexlovesdata: err
22:56 <blackburn> homogeneous kernel map
22:56 <alexlovesdata> in principle it is the same
22:56 <alexlovesdata> for the gaussian kernel
22:57 <blackburn> alexlovesdata: yeah, but HKM can't approximate a gaussian
22:57 <alexlovesdata> some of the HKM kernels are not useful in practice
22:57 <alexlovesdata> i would say gaussian is a special case of HKM approximations
22:57 <blackburn> alexlovesdata: which ones? I had good experience with any kernel with HKM..
22:58 <blackburn> alexlovesdata: ah, I wanted to ask you some things about my stupid kind of research
22:58 <@sonney2k> alexlovesdata, regarding dotfeatures - just implement sparse_dot and add_to_dense_vec as fast as possible
22:58 <@sonney2k> so no copying around etc
22:58 <alexlovesdata> RF is just another case of a homogeneous kernel map
22:59 <@sonney2k> and e.g. for sparse features it helps a lot to not first create a dense feature vector :)
22:59 <@sonney2k> but just operate on the non-zero components (if possible)
22:59 <@sonney2k> only then is the DotFeatures framework fast
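The point sonney2k is making — that the DotFeatures framework is fast only when dot products and accumulations touch just the stored non-zero entries, never a densified copy — can be illustrated with a toy sparse representation. The names sparse_dot and add_to_dense echo the methods mentioned above, but the code is a hypothetical sketch, not Shogun's API.

```python
# A sparse vector is kept as parallel (indices, values) lists; both
# operations below iterate only over the non-zeros, so their cost is
# O(nnz) rather than O(dim).

def sparse_dot(indices, values, dense_vec):
    """Dot product of a sparse vector with a dense one."""
    return sum(v * dense_vec[i] for i, v in zip(indices, values))

def add_to_dense(alpha, indices, values, dense_vec):
    """In-place dense_vec += alpha * sparse_vec, again touching only
    the non-zero components (no dense copy of the sparse vector)."""
    for i, v in zip(indices, values):
        dense_vec[i] += alpha * v

dense = [1.0, 2.0, 3.0, 4.0]
idx, val = [0, 3], [10.0, 0.5]        # sparse vector (10, 0, 0, 0.5)
d = sparse_dot(idx, val, dense)       # 10*1.0 + 0.5*4.0 = 12.0
add_to_dense(2.0, idx, val, dense)    # dense becomes [21.0, 2.0, 3.0, 5.0]
```

A preprocessor that first materialises x -> Phi(x) as a dense vector throws this sparsity advantage away, which is the objection to doing random features as a plain preprocessor.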
22:59 <@sonney2k> gtg
23:00 <alexlovesdata> Sonne ... I would say: it should suffice to embed the stuff in a dot feature itself
23:00 <alexlovesdata> then I could use its dot
23:00 <alexlovesdata> if you say: implement sparse_dot ... you mean for the preprocessor?
23:02 <alexlovesdata> as far as I see it, Sonne ... I would need to define, for each dot feature type, a way to define the gaussian in the space of that dot feature
23:02 <alexlovesdata> that's not generic
23:02 <alexlovesdata> like a gaussian in string space
23:03 <alexlovesdata> i would need to do that for each derived class (!)
23:05 <alexlovesdata> I have to think about that ...
23:05 <alexlovesdata> I see no fast ad hoc solution right now
23:21 <alexlovesdata> ok I think I have an idea ... for DotFeatures and for creating streaming features from existing ones ...
23:55 -!- alexlovesdata [~binder@e178001222.adsl.alicedsl.de] has left #shogun []
--- Log closed Sun Feb 24 00:00:49 2013