--- Log opened Wed Apr 10 00:00:26 2013
blackburnso is DRPK going to have a party?00:01
harpreetHi, I was interested in the 'Develop interactive machine learning demos' for GSoc.00:14
harpreetAny advice on what sort of work I should do before I apply?00:15
blackburnharpreet: this was answered quite a few times on the mailing list - consider starting to contribute with small patches00:19
harpreetSince the project is almost entirely concerned with Python, Django, JS, should I submit patches for the core C++ code or something else?00:22
medeeiipyesterday I asked about developing a CMakeLists. So working under windows is possible.... I did write the whole cmake list but the problem is the compiler is complaining about a lot of things.... e.g. pthread...00:23
medeeiipcan pthread be made optional?00:25
medeeiipthe compiler is complaining about dirent.h too (as that is not available on windows)00:26
medeeiipare there any more additional dependencies? (one was sys/types.h but I made that inclusion optional using #ifdef _WIN32 ... #else ... code)00:29
blackburnharpreet: no, it's not required to have C++ patches - any examples using python would work too00:33
blackburnmedeeiip: yes pthread is optional00:33
blackburnmedeeiip: that would be very helpful to fix these includes00:34
medeeiipcan you tell me where and how dirent.h is used ( I mean I can see where, actually how?)00:35
medeeiipi mean not specifically00:35
medeeiipbut to give me an idea00:35
blackburnmedeeiip: it is used to find files IIRC00:36
medeeiipso if i need to replace that dependency I need to do the same thing on windows in an alternative way....00:38
blackburnyes probably00:38
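If anyone wants to attack the dirent.h include, a hedged sketch of what a portable replacement could look like (illustrative names only, not shogun's actual code): the same "list files in a directory" operation via FindFirstFile on Windows and opendir/readdir elsewhere.

```cpp
#include <string>
#include <vector>

#ifdef _WIN32
#include <windows.h>
// Windows branch: FindFirstFile/FindNextFile replace dirent.h
std::vector<std::string> list_dir(const std::string& path)
{
    std::vector<std::string> out;
    WIN32_FIND_DATAA fd;
    HANDLE h = FindFirstFileA((path + "\\*").c_str(), &fd);
    if (h == INVALID_HANDLE_VALUE)
        return out;
    do { out.push_back(fd.cFileName); } while (FindNextFileA(h, &fd));
    FindClose(h);
    return out;
}
#else
#include <dirent.h>
// POSIX branch: the usual opendir/readdir loop
std::vector<std::string> list_dir(const std::string& path)
{
    std::vector<std::string> out;
    if (DIR* d = opendir(path.c_str()))
    {
        while (dirent* e = readdir(d))
            out.push_back(e->d_name);
        closedir(d);
    }
    return out;
}
#endif
```

Either branch returns the same kind of name list, so callers never need their own #ifdef.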
medeeiipis there any available list about what feature is optional and what is required?00:41
blackburnfeature of?00:41
medeeiipi mean optional components like (pthread as you said).... specifically what library is required and what is optional?00:43
blackburnahh no I don't think we have a list00:44
blackburnbasically it should compile always00:44
medeeiipit is directly complaining about pthread.....00:44
blackburnmedeeiip: all pthreads code should be guarded with preprocessor00:45
blackburnso that's a bug00:45
medeeiipno no.... hold a sec .... as config.h generation was not automated i created a config.h where I used #define HAVE_PTHREAD 100:47
medeeiiplet me check if it works without it.00:47
medeeiipan idiot i am....:D00:49
blackburnmedeeiip: yeah HAVE_PTHREAD should not be set if you don't have it00:51
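For reference, the kind of preprocessor guard meant here, sketched with made-up function names rather than shogun's real ones: with HAVE_PTHREAD undefined the locking collapses to a no-op and the surrounding code builds unchanged.

```cpp
#ifdef HAVE_PTHREAD
#include <pthread.h>
static pthread_mutex_t g_lock = PTHREAD_MUTEX_INITIALIZER;
inline void lock_pool()   { pthread_mutex_lock(&g_lock); }
inline void unlock_pool() { pthread_mutex_unlock(&g_lock); }
#else
// single-threaded fallback: locking is a no-op
inline void lock_pool()   {}
inline void unlock_pool() {}
#endif

// callers look identical whether or not pthreads are available
inline int guarded_increment(int* counter)
{
    lock_pool();
    int v = ++*counter;
    unlock_pool();
    return v;
}
```

So defining HAVE_PTHREAD in a hand-written config.h on a platform without pthreads selects the wrong branch, which matches the error above.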
medeeiipwhat is lbfgsfloatval_t datatype?01:05
medeeiipin optimization_libfgs.cpp01:05
blackburnmedeeiip: what kind of answer do you expect? ;)01:08
medeeiipfrom what library it is?01:09
medeeiip'n that is?01:09
medeeiipBFGS method?01:09
medeeiiplib for that?01:09
blackburnlimited memory broyden fletcher goldfarb shanno01:09
blackburnmedeeiip: it is included in the sources01:11
medeeiipbut lbfgsfloatval_t is not defined in lbfgs.h01:13
blackburnmedeeiip: where is it used?01:16
medeeiipexample though.....01:18
blackburnit might be it wasn't properly updated01:21
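For anyone else hitting this: lbfgsfloatval_t comes from liblbfgs (the L-BFGS implementation bundled into shogun's sources), where it is just a typedef whose width is picked at compile time, roughly as below. This is sketched from memory, so verify against the bundled lbfgs.h.

```cpp
// Approximate shape of the liblbfgs typedef (64-bit, i.e. double,
// is the default unless LBFGS_FLOAT says otherwise):
#ifndef LBFGS_FLOAT
#define LBFGS_FLOAT 64
#endif

#if LBFGS_FLOAT == 32
typedef float lbfgsfloatval_t;
#else
typedef double lbfgsfloatval_t;
#endif
```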
blackburnhave to sleep now01:23
medeeiipit's night in your country01:23
blackburnmedeeiip: yes deep night slowly becoming a morning01:24
medeeiipsame in india...... but01:24
medeeiipyou're from germany ?01:24
medeeiipisn't it?01:25
blackburnno, russia01:25
blackburnit is sonney2k who is from berlin01:25
medeeiipi see.....01:26
medeeiipbtw good night01:26
medeeiipi need sleep badly too....... tomorrow i've got college01:26
-!- medeeiip [~medeeiip@] has quit [Quit: Leaving]01:32
-!- FSCV [~FSCV@] has quit [Quit: Leaving]02:09
-!- foulwall [] has joined #shogun03:15
-!- phd [] has quit [Ping timeout: 248 seconds]03:39
-!- foulwall_ [] has joined #shogun03:44
-!- foulwall [] has quit [Read error: Connection reset by peer]03:44
-!- gsomix [~Miranda@] has quit [Ping timeout: 245 seconds]04:51
-!- blackburn [~blackburn@] has quit [Ping timeout: 245 seconds]04:51
-!- blackburn [~blackburn@] has joined #shogun05:07
-!- rishabh [~rishabh@] has joined #shogun05:34
-!- abinash [75ef5e6e@gateway/web/freenode/ip.] has joined #shogun06:56
-!- gzhd79 [] has joined #shogun06:59
-!- gzhd79 [] has quit [Client Quit]07:03
-!- foulwall_ [] has quit [Ping timeout: 252 seconds]07:33
-!- harpreet [~ceo@] has quit [Read error: Connection reset by peer]07:59
-!- foulwall [] has joined #shogun07:59
@sonney2kblackburn, muhaaaahaa you do boosting now :D08:00
blackburnsonney2k: yeah08:12
blackburnsonney2k: is the code crappy?08:13
@sonney2kblackburn, no idea - but this reminds me of my multiboost endeavour08:14
@sonney2kand I guess their code is massive - they have won a couple of competitions with that08:14
blackburnsonney2k: I do not know what to answer to last message08:14
@sonney2kgtg brb08:15
-!- bogdanc [] has joined #shogun08:22
sonne|workblackburn: haha08:24
blackburnsonne|work: I am scared by your laugh08:25
blackburnsonne|work: is adaboost-reg really that cool?08:30
blackburnI am quite surprised they got back to 2001 paper08:30
sonne|workblackburn: it certainly is a reasonable method08:31
sonne|worklike any other method of course08:31
blackburnsonne|work: what I do not get - do they have C++ impl of their adaboost?08:33
sonne|workblackburn: "they"?08:34
blackburnsonne|work: or him - whatever :)08:34
blackburnsonne|work: he said 'they' are limited with time08:34
-!- n4nd0 [] has joined #shogun08:35
-!- abinash [75ef5e6e@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]08:35
sonne|workblackburn: I don't know whom you mean08:36
blackburnsonne|work: roni08:36
sonne|workahh so not the multiboost guys08:36
sonne|workwell adaboost needs early stopping (or it will overfit)08:37
sonne|workor some kind of regularization08:37
sonne|workand of course it sucks to have too many baselearners in the end08:37
sonne|work(slow when using it)08:37
n4nd0adaboost doesn't overfit (or so said my professor always)08:37
sonne|workthen he has never used it08:37
blackburnhmm how can it *not* overfit?08:38
n4nd0mm I am not completely sure what was her claim08:38
n4nd0let me check slides08:39
blackburnn4nd0: I just can't see any 'rule' that makes it not overfitting algorithm08:39
blackburnI mean how many weak classifiers do we need?08:39
blackburn50, 100?08:40
sonne|workn4nd0: gunnar back in the days wrote a paper about it improving the situation08:40
n4nd0blackburn: what do you want to use it for?08:40
sonne|workback in 1999 or so08:40
blackburnn4nd0: I was stupid enough last night to suggest to help with integrating some adaboost reg08:40
n4nd0blackburn: multiboost code to shogun?08:41
blackburnn4nd0: no, adaboost reg by gunnar08:41
sonne|workblackburn: I guess adaboost* is easy08:41
sonne|workbut you need the framework for baselearners08:41
sonne|workso basically any shogun classifier could be baselearner08:42
blackburnsonne|work: what kind of framework?08:42
sonne|workblackburn: CBaseLearner ?08:42
blackburnsonne|work: you mean some ensemble classifier08:43
n4nd0sonne|work: if any shogun classifier could be a baselearner, why not CMachine?08:43
blackburnthat outputs this weighted sum?08:43
blackburnthat sounds easy - the only thing is how to learn these classifiers and their weights right?08:43
sonne|workn4nd0: yeah CMachine is what you will likely need to use though regression methods etc don't fit08:43
n4nd0aham true08:44
sonne|workblackburn: err no - you have a fixed parameter setting for base learners08:44
sonne|workblackburn: so you just call train(data)08:44
sonne|workand it is not so clear to me how to reweight data08:44
blackburnsonne|work: ah so we need weighting support here08:45
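The "how to reweight data" question here is the heart of AdaBoost. A toy, self-contained sketch of one round structure and the reweighting step, using brute-force decision stumps on 1-D data (no shogun API; every name below is illustrative):

```cpp
#include <cmath>
#include <vector>
#include <initializer_list>

struct Stump { double thresh; int sign; double alpha; };

// a stump predicts +1/-1 from a single threshold comparison
static int predict(const Stump& s, double x)
{
    return x > s.thresh ? s.sign : -s.sign;
}

// Train n_rounds stumps; w[i] is the per-example weight that grows
// for misclassified points each round (this is the reweighting).
std::vector<Stump> adaboost(const std::vector<double>& x,
                            const std::vector<int>& y, int n_rounds)
{
    const size_t n = x.size();
    std::vector<double> w(n, 1.0 / n);
    std::vector<Stump> H;
    for (int t = 0; t < n_rounds; t++)
    {
        // pick the stump with lowest weighted error (brute force)
        Stump best = {0.0, 1, 0.0};
        double best_err = 1.0;
        for (size_t i = 0; i < n; i++)
            for (int sign : {-1, 1})
            {
                Stump s = {x[i], sign, 0.0};
                double err = 0;
                for (size_t j = 0; j < n; j++)
                    if (predict(s, x[j]) != y[j]) err += w[j];
                if (err < best_err) { best_err = err; best = s; }
            }
        if (best_err >= 0.5) break;  // no useful weak learner left
        best.alpha = 0.5 * std::log((1 - best_err) / (best_err + 1e-12));
        // reweight: misclassified examples gain weight, then normalize
        double z = 0;
        for (size_t j = 0; j < n; j++)
        {
            w[j] *= std::exp(-best.alpha * y[j] * predict(best, x[j]));
            z += w[j];
        }
        for (size_t j = 0; j < n; j++) w[j] /= z;
        H.push_back(best);
    }
    return H;
}

// final classifier: sign of the alpha-weighted vote
int classify(const std::vector<Stump>& H, double x)
{
    double f = 0;
    for (size_t t = 0; t < H.size(); t++)
        f += H[t].alpha * predict(H[t], x);
    return f >= 0 ? 1 : -1;
}
```

Note the weights only enter through the base learner's weighted error, which is exactly why shogun base learners would need weighted-example support.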
n4nd0found it in slides directly08:45
n4nd0Beauty of AdaBoost08:46
n4nd0among other things08:46
n4nd0Test Error: Asymptotes - no over-fitting observed. It continues to decrease after training error vanishes.08:46
blackburnn4nd0: sounds like the best method08:46
blackburnn4nd0: that's crazy08:46
blackburnn4nd0: I am sure it will increase as weighted sum of a lot of floats gets not really stable with time08:47
n4nd0blackburn: IIRC the weights are normalized08:47
n4nd0not sure if that helps in what you are referring to08:48
sonne|workn4nd0: read the abstract of*&source=bl&ots=MvrfwEEJIj&sig=G72a-SjKPxM6VIA0Xj_RdPtDYJc&hl=en&sa=X&ei=2QplUbe3KMm7hAePioBI&ved=0CC8Q6AEwAA08:48
sonne|workn4nd0: no overfitting if no noise08:48
sonne|workbut w/ noise (which is always present)08:48
sonne|workyou have it08:48
sonne|workthat is gunnars '99 paper08:48
sonne|workadaboost reg08:48
n4nd0sonne|work: aham ok, I don't really know how to interpret the noise here08:48
sonne|workn4nd0: well any measured data is noise08:49
n4nd0the course I am talking about was really focused on using Adaboost for face detection08:49
n4nd0with this Haar features08:49
sonne|workyeah noisy08:49
sonne|workas hell08:49
blackburnmodern method08:49
sonne|worksvm is from 1997 :P08:49
n4nd0mmm I don't really understand then why it is said like that in the slides08:50
blackburnsonne|work: yeah that's why svm is 'outdated' comparing to deep learning things08:50
sonne|workand neural nets from decades ago :P08:50
blackburnsonne|work: svm is from 1970 actually08:50
sonne|workeven more outdated then08:50
sonne|workdeep learning is just hip again since people had SVMs (with guaranteed convergence) for long enough08:51
sonne|workblackburn: theory of SVM maybe but not the applications08:51
blackburnsonne|work: no it is pretty different08:51
blackburnsonne|work: learning features is important08:51
-!- gzhd79 [] has joined #shogun08:52
sonne|workblackburn: so?08:52
blackburnsonne|work: I can't see SVM learning anything on features so it is quite different I'd say08:52
sonne|workit is like what NN's have always done08:52
sonne|workbut since NN's have local minima08:52
sonne|workthey are extremely hard to control08:52
blackburnsonne|work: when working on images it gets important08:52
sonne|workthe right features are important for any task08:53
shogun-buildbotbuild #355 of nightly_default is complete: Failure [failed test]  Build details are at
n4nd0blackburn: aren't there ways to use large margin SVM-like learning to choose features?08:53
blackburnsonne|work: yes but we should not engineer it08:53
blackburnn4nd0: just *choose*08:53
n4nd0on the meantime test break... :S08:53
blackburnn4nd0: deep learning is about constructing a few levels from low-level (pixels)08:54
sonne|workblackburn: yeah sure that is the dream - but there still is a long way to go08:54
blackburnsonne|work: yes but it is already making some steps to automagic features08:54
sonne|workblackburn: with NNs you choose certain architectures, ways of training etc that have a massive impact on what features it learns08:54
blackburnsonne|work: that's true08:54
sonne|workblackburn: well then use MKL with all kinds of kernels08:55
sonne|workit also learns some features08:55
blackburnsonne|work: but SVM do not consider it at all08:55
sonne|workbut sucks badly08:55
sonne|workI don't agree08:55
sonne|workSVM learns important features in feature space08:55
sonne|workbut you limit the features by choosing a kernel08:55
n4nd0sonne|work: why does the approach MKL with lot of kernels suck?08:55
sonne|workand yes you have a hard time to express these features08:56
blackburnsonne|work: feature selection is totally different from feature extraction08:56
sonne|workn4nd0: it usually gives worse performance than just adding kernels08:56
blackburnsonne|work: I tend to believe we should *extract* features not select08:56
sonne|workI don't08:56
sonne|workI think the more features - the better08:56
blackburnsonne|work: yes, but again back to images08:57
blackburnsonne|work: if we just select pixels08:57
blackburnit always sucks08:58
blackburnbut if we have some high-level on it (like sift)08:58
sonne|workblackburn: sure08:59
sonne|workso they use convolutional NN's08:59
sonne|workothers suck too08:59
blackburnsonne|work: yes convolution is a good base to transform pixels to something08:59
blackburnsonne|work: it is engineered to do convolution - that's bad09:00
sonne|workblackburn: believe - a ML method is as good as the person who trains it09:00
sonne|workthere are soo many tricks involved in everything09:01
sonne|workthere usually is no general rule09:01
sonne|worksaying that methodA is better than methodB09:01
sonne|worksome methods are a more natural choice for certain tasks of course but that's it09:01
blackburnsonne|work: I am speaking not about methods but the approach09:01
shogun-buildbotbuild #1009 of deb3 - modular_interfaces is complete: Failure [failed test python_modular]  Build details are at  blamelist: Soeren Sonnenburg <>09:02
blackburnlearning some method on engineered features is (in general) worse09:03
blackburnthan learning features and then method09:03
sonne|workblackburn: to the contrary09:03
sonne|worklearning features & method together is usually worse09:04
sonne|workhumans are usually much better in engineering features09:04
blackburnsonne|work: depends09:04
blackburnsonne|work: with images (it is the only field I know) it is not - sparse features, dictionary learning, convolution stuff - these are better09:05
blackburnsonne|work: e.g there is no heuristic in fitting HOG cell sizes09:06
blackburnit is just stupid model selection09:06
sonne|workblackburn: yes but if you then add prior knowledge and force the thing to focus on reasonable features09:07
sonne|workyou will get even better results09:07
blackburnbah I am still at home should be at job09:08
sonne|workblackburn: and I am at work but hmm... ok working now09:09
blackburnsonne|work: you see09:09
blackburneverything from the top is about extracting09:09
sonne|workyes and lots of is about learning09:09
sonne|workand lots of is about extracting & learning09:09
sonne|workand limiting learning of features09:10
blackburnsonne|work: ensembling + feature extraction + learning09:10
sonne|workk-NN with 0.63!09:10
sonne|workbeats half of the NN08:11
blackburnsonne|work: very well engineered distance I guess09:11
blackburnshape context matching should be something that matches these curves in a smart way09:11
sonne|workactually even 0.5209:11
sonne|workbetter than any SVM08:11
sonne|workcommittee of 35 conv. net, 1-20-P-40-P-150-10 [elastic distortions] width normalization 0.2309:12
blackburnsonne|work: it must be pretty slow09:12
sonne|workand they do distortions etc09:13
sonne|workthe usual invariance tricks09:13
blackburnsonne|work: yes most of top methods do it with distortions09:13
sonne|workrotation/shifting lala09:13
-!- blackburn [~blackburn@] has quit [Quit: Leaving.]09:14
-!- medeeiip [~medeeiip@] has joined #shogun09:28
-!- sumit [73f91219@gateway/web/freenode/ip.] has joined #shogun09:29
-!- gzhd79 [] has left #shogun []09:32
-!- blackburn [] has joined #shogun09:38
blackburnsonne|work: did you like that new mentor?09:42
sonne|workblackburn: 25minutes!09:42
blackburnI had to reject him though09:42
blackburnsonne|work: what 25 minutes?09:42
sonne|workblackburn: yeah cool guy09:42
sonne|workit took you to get to work09:42
blackburnsonne|work: hah yes with no jams it is pretty fast09:43
blackburnsonne|work: 1.5 km actually09:43
sonne|workblackburn: we certainly need more mentors like him09:44
blackburnsonne|work: it takes more time to enter the building and get to that 13th floor09:44
blackburnthan moving by bus09:44
sonne|workthough it is a bit sad that he didn't speak japanese09:44
-!- sumit [73f91219@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]09:44
blackburnsonne|work: he has other cool skills09:45
blackburnsonne|work: he speaks machine code at least09:46
sonne|worktopcoder - within top 10?09:48
sonne|workwe need HEROS09:49
blackburnsonne|work: I don't mind a fluent machine code speaker in team09:49
sonne|workkind of a matrix operator09:49
sonne|workwe should equip ourselves with phone booths to exit the matrix09:50
sonne|workand some reasonable mobiles09:50
blackburnsonne|work: matrix operator is a must have09:50
sonne|workbut only one that can cope with a few liters of vodka per day09:51
blackburnsonne|work: mind releasing 2.1.1 once I get finished with tapkee?09:56
sonne|workyou have to ask our release manager09:56
sonne|workI think it has to be 2.2 btw09:56
blackburnsonne|work: ahhh yes09:56
sonne|workwe changed some api's09:57
blackburnsonne|work: no 2.1.109:57
-!- bogdanc [] has quit [Quit: Konversation terminated!]10:06
-!- n4nd0 [] has quit [Quit: leaving]10:10
-!- rishabh [~rishabh@] has quit [Remote host closed the connection]10:21
-!- Yanglittle [deb20af1@gateway/web/freenode/ip.] has joined #shogun10:40
YanglittleHi, I have a question, how should I do when I want to combine the pre-computed kernel ? I didn't find the examples in the shogun.10:42
sonne|workYanglittle: just add the kernel like any other kernel10:44
sonne|workyou just don't need any features...10:47
Yanglittlethis is the usage of add_kernel: sg('add_kernel', weight, kernel-specific parameters)  the third parameter is replaced by pre-computed kernel matrix?10:50
blackburnsonne|work: your biggest mistake was to formulate that demos idea :D10:51
sonne|workohh static interfaces10:52
sonne|workYanglittle: please use a modular interface10:52
blackburnwe have somewhat 5 6 7 may be 8?10:52
-!- n4nd0 [] has joined #shogun10:52
sonne|workblackburn: in worst case we create one project for interactive *on the web* and one for shogun's examples10:53
sonne|workblackburn: hmmhh this adaboost code must be magic...10:54
blackburnsonne|work: that would push some ML idea out..10:54
sonne|workI still don't know how you get weights for examples to work w/ shogun10:54
sonne|workbut hey10:54
blackburnsonne|work: I LOVE MAGIC CODE10:54
sonne|workblackburn: why should it push out an ML idea?10:55
blackburnsonne|work: # slots?10:55
sonne|workwell last year we gave one back10:55
blackburnsonne|work: true10:55
sonne|workblackburn: look at
sonne|workhe also has the orginal adaboost algorithm in there11:05
blackburnhahah funny code11:06
sonne|workI didn't know that
sonne|workthe adaboost authors got that prize11:06
sonne|workbut for real boosting use
blackburnsonne|work: in two words - what is the problem with integrating multiboost?11:07
sonne|workblackburn: it is huuuge11:08
blackburnsonne|work: and one more important Q - why multiboost doesn't have adaboost reg11:08
sonne|workit has everything invented like we do but different ...11:08
sonne|workit has adaboost MH11:08
blackburnsonne|work: what is MH?11:08
sonne|workseems like for multilable11:09
sonne|workand w/ confidences11:10
sonne|workbut no regularization!?11:10
blackburnI mean if adaboost reg is kind of cool - why didn't they implement it11:10
blackburnthat seems to be strange at least11:11
sonne|workI guess early stopping is good enough11:13
-!- foulwall [] has quit [Remote host closed the connection]11:20
-!- gsomix [~Miranda@] has joined #shogun11:30
gsomixhi guys11:30
n4nd0hey gsomix!11:30
medeeiiphello everyone I'm working to build shogun on windows.... seems i almost have to entirely rewrite math.h using _win32 preprocessor11:40
medeeiipbut currently i'm stuck with this error:11:40
medeeiipuse of undefined type 'shogun::shogun::CStringFeatures<ST>'11:40
medeeiip1>          with11:40
medeeiip1>          [11:40
medeeiip1>              ST=uint16_t11:40
medeeiip1>          ]11:40
n4nd0oops rewriting code... that must be tough11:41
medeeiipfrom src\shogun/distributions/HMM.h(582)11:41
n4nd0medeeiip: how are you trying to build it? using cygwin?11:41
medeeiipno native vs compiler......11:41
n4nd0mmm I am not entirely sure what you mean11:41
blackburnmedeeiip: you probably didn't define a thing that enables that type11:41
n4nd0in any case I think that the easiest way to get shogun working in windows is with cygwin, although I have no experience11:42
medeeiipit's not about just getting it working.... I'm enthusiastic about porting shogun to windows...11:43
n4nd0ah ok I see11:43
medeeiipabout that error:11:43
medeeiipthere is a def :11:44
medeeiipCStringFeatures<uint16_t>* p_observations;11:45
medeeiipone more thing i did....11:45
blackburnmedeeiip: -DUSE_UINT16 and so on11:45
medeeiipimplemented mman-win3211:45
-!- 18VAAX8BZ [] has joined #shogun11:46
-!- phd [] has joined #shogun11:46
-!- 18VAAX8BZ [] has left #shogun []11:46
medeeiipblackburn: I didn't understand11:46
medeeiipwhat u just said11:46
medeeiip -DUSE_UINT16 where?11:47
blackburnmedeeiip: you should have USE_UINT16 define to have features for uint16_t enabled11:47
medeeiipactually i used a #ifdef _WIN32 and included stdint.h11:48
medeeiipwhere uint16_t is available11:48
medeeiipI also implemented mman-win32 from
medeeiipso memory management shouldn't be a problem.....11:52
blackburnmedeeiip: hmm wait it is not necessary to have that define11:53
blackburnit is relevant for interfaces but not for anything else it seems11:53
-!- heiko1 [] has joined #shogun11:56
blackburnn4nd0: thanks for answering these messages11:58
n4nd0blackburn: :) it's nothing11:59
medeeiipblackburn: any hack for that.......12:02
-!- arp [73f88294@gateway/web/freenode/ip.] has joined #shogun12:06
arphey, shogun does not work in ubuntu 64 bit, right?12:07
n4nd0arp: Hi, it does work.12:07
n4nd0what problem do you have?12:08
arpi was installing it via the ubuntu software center, it kept on failing, saying that it failed to download some packages12:09
arpor some packages could not be downloaded12:09
n4nd0Aham, I assumed you were installing from source, sorry.12:09
n4nd0I didn't actually know you can install it via the software center :)12:10
n4nd0arp: What package in particular did you try to install? I have just tried with libshogun11 and it worked fine here.12:12
-!- sumit [73f91219@gateway/web/freenode/ip.] has joined #shogun12:12
YanglittleIt can't find the module when I run "from shogun.Kernel import GaussianKernel" and I can't find them on my PC..12:13
n4nd0Yanglittle: I guess you built shogun for python_modular. Did you make install after building as well?12:14
-!- arp [73f88294@gateway/web/freenode/ip.] has quit [Quit: Page closed]12:15
Yanglittlethree steps I all executed.12:16
n4nd0Yanglittle: is it possible that you have more than one python version in your system?12:17
n4nd0it could be that shogun was built against one of the versions and you are trying to run it using another12:17
-!- ZeThomas [~thomaspr@2a02:2c40:100:a000:a81c:41e5:277a:c326] has joined #shogun12:18
Yanglittlei had two versions installed before. now just version 2.712:20
Yanglittlei will rebuild it12:20
n4nd0make sure the version of python detected by shogun is the same one you want to use12:21
n4nd0detected by shogun = detected by the configure script12:21
Yanglittlei will, thanks.12:22
ZeThomashey I get an error while compiling shogun2.112:23
n4nd0ZeThomas: and the error is?12:23
ZeThomasi just installed openblas (and removed atlas)12:24
ZeThomasand wanted to recompile12:24
ZeThomasnumpy 1.7 has been recompiled successfully just now12:24
n4nd0mm then maybe it is incompatible with openblas?12:25
ZeThomasI don't know, is it?12:25
n4nd0I didn't know about it, it might be12:26
-!- jptech93 [~quassel@] has joined #shogun12:28
-!- arp [73f88294@gateway/web/freenode/ip.] has joined #shogun12:28
n4nd0ZeThomas: excuse me?12:28
ZeThomasn4nd0: I should watch my language, sorry12:29
-!- blackburn [] has quit [Quit: Leaving.]12:33
-!- foulwall [~foulwall@2001:da8:215:6901:69a5:d6ae:aaf0:6ef1] has joined #shogun12:34
-!- blackburn [] has joined #shogun12:36
-!- heiko1 [] has quit [Quit: Leaving.]12:42
-!- medeeiip [~medeeiip@] has quit [Quit: Leaving]12:49
-!- abinashpanda [75ef5e6e@gateway/web/freenode/ip.] has joined #shogun12:59
-!- jptech93 [~quassel@] has quit [Remote host closed the connection]13:04
-!- jptech93 [~quassel@] has joined #shogun13:05
-!- abinashpanda [75ef5e6e@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]13:07
-!- shrey [ca4eaca2@gateway/web/freenode/ip.] has joined #shogun13:11
-!- jptech93 [~quassel@] has quit [Remote host closed the connection]13:15
-!- jptech93 [~quassel@] has joined #shogun13:16
ZeThomasblackburn: do you know if shogun supports openblas?13:18
blackburnZeThomas: should be possible I think13:24
ZeThomasi compiled it, and successfully numpy with it13:25
ZeThomasbut shogun gives me errors13:25
blackburnZeThomas: ehmmm 311 looks strange to me :)13:28
blackburnwhy 'I' is treated as complex13:28
blackburnZeThomas: what is the compiler?13:28
blackburnZeThomas: could you please try to put some char('I') or so?13:30
blackburn311 and 347 of src/shogun/mathematics/lapack.cpp13:31
ZeThomasso line 311 becomes: char I = char('I') ?13:33
ZeThomasdone, I'm trying to recompile now13:36
* ZeThomas goes for coffee while shogun compiles13:38
blackburnZeThomas: you may use make -j4 to use 4 cores13:39
ZeThomasoh, that's handy :)13:39
sonne|workand ccache!13:40
blackburnyes and ccache13:40
-!- n4nd0_ [] has joined #shogun13:42
-!- n4nd0 [] has quit [Quit: Reconnecting]13:42
ZeThomasso "make -j8 -ccache" ?13:42
ZeThomasblackburn: still the same error13:42
blackburnZeThomas: no, ccache is a wrapper on top of compiler13:44
blackburnnot a key13:44
blackburnZeThomas: hmm let me think13:44
-!- n4nd0_ is now known as n4nd013:44
n4nd0ZeThomas: no, ccache make -j813:44
-!- n4nd0 [] has quit [Client Quit]13:44
-!- n4nd0_ [] has joined #shogun13:45
-!- n4nd0_ [] has quit [Client Quit]13:45
-!- n4nd0 [] has joined #shogun13:45
ZeThomasthanks n4nd013:45
blackburnn4nd0: ?? would that work?13:46
sonne|workIIRC you need to install ccache and put a certain dir in $PATH13:46
sonne|worksuch that it calls ccache when gcc is called13:47
blackburnand replace CXX and CC with ccache g++13:47
n4nd0blackburn: do you normally use it as make --ccache?13:47
n4nd0blackburn: I always do ccache make blah blah13:47
blackburnn4nd0: so does that work with ccache make?13:47
n4nd0hehe I hope so13:47
blackburnsonne|work: any idea why 'I' is treated as complex?13:48
blackburnZeThomas: okay other guess13:48
blackburn'\I' maybe?13:48
sonne|workwhat is going on?14:03
sonne|workblackburn: my suggestion is to use char range = 'I';14:06
sonne|workthen it should work14:06
sonne|workblackburn: I think it treats I as a complex nr14:06
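A plausible explanation, sketched below with a stand-in macro rather than the real header: C99's &lt;complex.h&gt;, which BLAS/LAPACK headers can drag in, defines a macro named I for the imaginary unit, so any local variable called I breaks. The #define here only simulates that clash; everything in this block is illustrative.

```cpp
// stands in for complex.h's `#define I _Complex_I`
#define I simulated_imaginary_unit

// char I = 'I';        // with the real macro this expands to
//                      // `char _Complex_I = 'I';` and fails to parse
char lapack_range = 'I';  // renaming, as done above, side-steps the macro

#undef I
```

Which is why `char range = 'I';` fixes lapack.cpp regardless of compiler: the collision is with the header's macro namespace, not with the type system.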
-!- arp [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]14:10
-!- sumit [73f91219@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]14:11
ZeThomasyes that seems the case, so is this an OpenBLAS specific thing then?14:12
-!- jptech93 [~quassel@] has quit [Remote host closed the connection]14:15
-!- jptech93 [~quassel@] has joined #shogun14:17
ZeThomasblackburn, sonne|work, this seems to have done the trick: I refactored every occurrence of [varname] I to range14:17
ZeThomashowever it gives errors for SGMatrix.cpp (, I suspect of the same kind14:18
sonne|work    SGMatrix<int8_t> I(size, size);14:19
sonne|workcould you please call this identity there?14:20
sonne|workZeThomas: and please send us a git pull request against the develop branch14:20
ZeThomassonne|work: I'd be delighted, but you might want to talk me through it, I'm a git noob...14:23
sonne|workn4nd0: could you     please help ZeThomas?14:25
n4nd0back again14:27
sonne|workn4nd0: ^ please help him to set up git if you have time  - thanks!14:28
n4nd0ZeThomas: so how is it going14:28
n4nd0although I have not tried yet myself git-flow, blackburn or someone else can tell us if we need some help about it though14:30
ZeThomasok n4nd0, I git cloned git://
blackburnyeah sure14:30
ZeThomasand made the changes14:30
n4nd0ZeThomas: ok then let's start again14:31
blackburnZeThomas: I am afraid you would need to transfer your changs to your fork14:31
blackburnZeThomas: is it ubuntu/debian you are running?14:31
ZeThomasblackburn: yes14:31
n4nd0because you fork the main repository and clone that one14:31
blackburnZeThomas: please install git flow (sudo apt-get install git-flow)14:31
n4nd0blackburn: I guess however it could be enough to change the remotes?14:31
blackburnn4nd0: ha good point14:31
ZeThomasblackburn: done14:32
blackburnnow you have to 'git flow init'14:33
ZeThomasok: Branch name for production releases: [master]14:34
n4nd0ZeThomas: give me a second14:34
n4nd0this was already explained in the mailing list14:34
blackburnZeThomas: yes that's default14:34
n4nd0I can send you a link14:34
blackburnZeThomas: next release is develop14:35
ZeThomasn4nd0: please do14:35
ZeThomassonne|work: it still fails to compile:
n4nd0Crtl+f workflow14:36
n4nd0mail by Sergey on April 4th14:36
sonne|workZeThomas: argh - please rename the variable I in line 106 to index14:37
-!- n4nd0 [] has quit [Ping timeout: 246 seconds]14:41
ZeThomassonne|work: in which file?14:42
ZeThomasblackburn: ok git flow is set up14:42
sonne|workI guess there too shogun/lib/external/libqp.h:6114:45
-!- votjak [] has joined #shogun14:48
-!- kpn [73f88294@gateway/web/freenode/ip.] has joined #shogun14:48
-!- n4nd0 [] has joined #shogun14:49
n4nd0got connection troubles14:49
blackburnZeThomas: once it compiles do 'git flow support complex_literals'14:49
blackburncomplex_literals is the name of branch to be created14:49
-!- jptech93 [~quassel@] has quit [Remote host closed the connection]14:55
ZeThomasblackburn: Unknown subcommand: 'complex_literals'14:56
ZeThomassonne|work: it compiles now, thanks. Is this an OpenBLAS specific thing?14:56
-!- kpn [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]14:57
-!- jptech93 [~quassel@] has joined #shogun14:57
n4nd0ZeThomas: my guess is that there is a global in OpenBlas called I14:57
ZeThomasn4nd0: oh, and in c++, global names take precedence over local ones? or in no case can you have doubles?14:58
n4nd0ZeThomas: with doubles you mean two variables with the same name?14:58
ZeThomasn4nd0: yes14:58
ZeThomasas you can do in python e.g.14:59
n4nd0I am not sure14:59
n4nd0ZeThomas: what could it be otherwise?14:59
blackburnZeThomas: oops sorry14:59
blackburnZeThomas: git flow support start complex_literals14:59
ZeThomasblackburn: Missing argument <base>15:00
n4nd0blackburn: why do you think this error with the I happened?15:00
blackburnn4nd0: no idea but your explanation looks valid15:01
ZeThomasblackburn: I add develop at the end?15:01
blackburnZeThomas: hmm yeah try that15:01
blackburnZeThomas: sorry we changed our model to git flow may be a week ago15:01
-!- medeeiip [~medeeiip@] has joined #shogun15:01
ZeThomasok: You are now on branch 'support/complex_literals'15:01
blackburnZeThomas: okay so call 'git status' and check the files you made changes in15:02
blackburn'git add [filename]' each of them15:02
ZeThomasblackburn: ok, done15:07
blackburnZeThomas: git commit -m 'your message here'15:08
-!- jptech93 [~quassel@] has quit [Remote host closed the connection]15:08
-!- jptech93 [~quassel@] has joined #shogun15:09
ZeThomasblackburn, ok, what's next?15:11
blackburnZeThomas: tricky part15:12
blackburnZeThomas: git remote rename origin upstream15:12
blackburnZeThomas: press the fork button15:13
ZeThomasblackburn: error: Could not rename config section 'remote.origin' to 'remote.upstream'15:13
blackburnZeThomas: hmm okay15:14
blackburnZeThomas: git remote add upstream
-!- foulwall [~foulwall@2001:da8:215:6901:69a5:d6ae:aaf0:6ef1] has quit [Remote host closed the connection]15:17
ZeThomasblackburn: ok15:18
blackburnZeThomas: done?15:18
ZeThomasso now?15:19
blackburnZeThomas: git remote set-url [ssh url to your fork]15:20
blackburnZeThomas: git remote set-url origin [ssh url to your fork]15:21
blackburnZeThomas: you may get that link on the page of your fork15:21
ZeThomasblackburn: fatal: No such remote 'origin'15:25
ZeThomasgit remote add origin ?15:26
n4nd0ZeThomas: origin should point to your fork, not to the main one15:27
n4nd0ZeThomas: it should be something like ...ZeThomas/shogun.git15:27
n4nd0where ZeThomas is your username in github15:27
ZeThomasoh, I'm so bad at git...15:28
ZeThomasso git remote add origin ?15:28
n4nd0it is hard at the beginning, I suggest you to check gitbook and some videocasts15:28
n4nd0ZeThomas: did you already fork in github?15:29
ZeThomasyes, that link is my fork15:29
blackburnZeThomas: better use ssh url15:29
blackburnhowever https would work too15:29
n4nd0blackburn: faster with ssh?15:30
blackburnn4nd0: requires no authorization15:30
ZeThomasok, so I do: git remote add origin ?15:30
blackburnZeThomas: yes15:30
n4nd0blackburn: oh yeah, sure15:30
ZeThomasblackburn: ok, that's done15:31
blackburnZeThomas: ok near to ready!15:31
blackburnZeThomas: try to do git push origin15:31
ZeThomasPermission denied (publickey). fatal: The remote end hung up unexpectedly...15:32
n4nd0ZeThomas: you have to put your public ssh key in github I think15:33
blackburnZeThomas: just to check could you please git remote -v?15:34
blackburnand paste the output15:34
blackburnZeThomas: alright15:36
blackburnZeThomas: please follow the link n4nd0 put15:36
blackburnand set up a public key15:36
ZeThomasI'm on it15:36
ZeThomasblackburn: I get this after the push15:39
blackburnZeThomas: alright15:40
blackburnno problem15:40
blackburnZeThomas: git checkout master15:40
blackburngit pull --rebase origin master15:40
blackburngit checkout develop15:40
blackburngit pull --rebase origin develop15:40
blackburngit checkout support/complex_literals15:40
blackburngit rebase develop15:41
blackburngit push origin support/complex_literals15:41
blackburnsomething like this I think15:41
ZeThomasafter git pull origin master I get: Aborting\n could not detach HEAD15:42
-!- gaurang [] has joined #shogun15:43
blackburnZeThomas: what is the branch you are on?15:46
blackburngit status15:46
ZeThomason branch master15:47
ZeThomasthe files I added have disappeared though :D15:49
blackburnZeThomas: you committed them right?15:49
ZeThomaswell, I guess15:50
blackburnso that's ok they are on a different branch15:50
-!- foulwall [~foulwall@2001:da8:215:6901:d1f1:befd:256d:419] has joined #shogun15:51
blackburnZeThomas: I am confused by that error15:51
ZeThomasblackburn: you and I both...15:52
blackburnZeThomas: okay lets get back to that support/complex_literals branch15:52
blackburngit checkout complex_literals15:52
blackburngit flow support finish complex_literals15:53
-!- foulwall [~foulwall@2001:da8:215:6901:d1f1:befd:256d:419] has quit [Ping timeout: 245 seconds]15:55
ZeThomasUnknown subcommand: 'finish'15:56
blackburnZeThomas: argghhhh15:57
blackburnmy bad15:57
blackburnZeThomas: git branch -m support/complex_literals feature/complex_literals15:58
-!- yefuneh [3edb8730@gateway/web/freenode/ip.] has joined #shogun15:59
yefunehCan someone point me to a **simple** example using shogun in lua?16:00
blackburnyefuneh: something like that16:00
ZeThomasblackburn: done16:00
yefunehno, too complicated16:01
blackburnZeThomas: git feature finish feature/complex_literals16:01
-!- gsomix [~Miranda@] has quit [Read error: Connection reset by peer]16:02
ZeThomasblackburn: I suppose you mean git flow feature finish feature/complex_literals16:02
ZeThomasblackburn: No branch matches prefix 'feature/complex_literals'16:02
yefunehblackburn when looking at svmlight for example, you just run svm_learn and svm_classify, is there something that just shows how to take a training file create a model and then run that model against the test data?16:03
blackburnZeThomas: ohhhh16:03
blackburnZeThomas: was it renamed?16:03
blackburnyefuneh: no probably there is no such example16:04
n4nd0yefuneh: this looks simple enough16:04
-!- heiko1 [] has joined #shogun16:05
blackburnn4nd0: could you please back me up with ZeThomas? have to get to work16:05
n4nd0blackburn: sure16:05
n4nd0blackburn: but no idea about git flow yet :)16:06
blackburnn4nd0: it is just about branches I think16:07
yefunehn4nd0 - thanks. What is load_labels for? My test and training data are in a format like: -1 1:5 2:7 ... where -1 is the label16:07
n4nd0yefuneh: wild guess - load labels is to take the labels from a file and put them in a variable16:09
yefunehn4nd0 thanks16:10
n4nd0yefuneh: if your labels are in that format you can either code your own functions to parse the data in a similar way to how it is done in the load.lua script16:10
n4nd0yefuneh: or you can create new data files following the convention of the files used in the example and re-use the load functions16:10
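The format yefuneh describes ("-1 1:5 2:7 ...") is the svmlight/libsvm sparse format. A minimal parser sketch along the lines n4nd0 suggests (a hypothetical helper, not shogun's own load_labels):

```python
# Parse one svmlight/libsvm-format line into a label and a sparse feature
# dict. Hypothetical helper for illustration only.
def parse_svmlight_line(line):
    parts = line.split()
    label = float(parts[0])          # leading token is the label, e.g. -1
    feats = {}
    for tok in parts[1:]:            # remaining tokens are index:value pairs
        idx, val = tok.split(":")
        feats[int(idx)] = float(val)
    return label, feats

label, feats = parse_svmlight_line("-1 1:5 2:7")
print(label, feats)  # -1.0 {1: 5.0, 2: 7.0}
```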
ZeThomasn4nd0, me too, I have to get to work... can we do this some other time?16:11
n4nd0ZeThomas: sure16:11
n4nd0just come around some time16:11
ZeThomasok, cheers16:11
ZeThomascu, thanks for the help16:11
yefunehn4nd0 - I'll have to look at the example files. Based on the examples it seems much more complicated than running svmlight/libsvm directly on the data16:11
-!- ZeThomas [~thomaspr@2a02:2c40:100:a000:a81c:41e5:277a:c326] has quit [Quit: Leaving]16:11
-!- phd [] has quit [Ping timeout: 252 seconds]16:13
-!- frl [73f88294@gateway/web/freenode/ip.] has joined #shogun16:15
-!- phd [] has joined #shogun16:15
-!- foulwall [] has joined #shogun16:17
-!- yefuneh [3edb8730@gateway/web/freenode/ip.] has quit [Quit: Page closed]16:17
-!- frl_ [73f88294@gateway/web/freenode/ip.] has joined #shogun16:17
-!- n4nd0 [] has quit [Ping timeout: 255 seconds]16:18
frl_in the issue "Kernel PCA test/check", we have to implement the kernel pca algorithm in python without using the shogun libraries rite?16:18
frl_m i correct?16:18
-!- frl [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]16:19
frl_any suggestions?16:19
heiko1frl_: see email16:27
-!- ppletscher [] has quit [Quit: ppletscher]16:36
-!- ppletscher [] has joined #shogun16:40
-!- medeeiip__ [~medeeiip@] has joined #shogun16:47
-!- medeeiip [~medeeiip@] has quit [Ping timeout: 245 seconds]16:49
-!- foulwall [] has quit [Remote host closed the connection]16:49
-!- ppletscher [] has quit [Quit: ppletscher]16:56
@sonney2kblackburn, ^17:03
@sonney2kI am just creating some python script to generate a page from that which we can put on the homepage17:03
heiko1sonney2k: ok17:04
heiko1I don't find that overly useful though17:04
heiko1I think we should rather have a list of what we can do, which is a bit more detailed17:04
heiko1like "features"17:04
heiko1since this matrix oversimplifies somewhat17:04
frl_what does "integration test for kernel pca" mean?17:04
-!- frl_ [73f88294@gateway/web/freenode/ip.] has quit [Quit: Page closed]17:05
blackburnhe left no chance to answer17:05
heiko1frl_: this saves the results of the example to a file once and then every time the example is run it is compared against the old result. This way we can avoid algorithms changing without us noticing17:05
-!- kipp [73f88294@gateway/web/freenode/ip.] has joined #shogun17:05
heiko1going for coffee ....17:06
@sonney2kheiko1, blackburn - here it is
kippwhat does integration test for kernel pca mean? does it mean that the kernel matrix is valid?17:07
kippm i correct?17:07
blackburnkipp: what is a valid kernel matrix?17:08
blackburnkipp: that was for you probably?17:10
blackburn(7:05:25 PM) heiko1: frl_: this saves the results of the example to a file once and then every time the example is run it is compared against the old result. This way we can avoid algorithms changing without us noticing17:10
kippblackburn: kernel matrix should be positive definite17:10
blackburnkipp: yeah but it would be a test for kernel not pca, right? ;)17:10
kippyes ,yes.sorry typed pca by mistake17:11
blackburnkipp: integration tests are usually checking if results are staying the same17:11
blackburnso it would be just project something and check if it is the same as it was before17:11
kippso it's like an accuracy checker17:11
blackburnkipp: some sort of17:12
kippok, so it's like checking if the data has become linearly separable or not17:17
blackburnkipp: no the data stays so the output should stay too17:17
kippas in the output data is the same as the input data after kernel pca17:18
blackburnkipp: no, as in the output data stays the same after some code is changed17:20
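The save-once/compare-later idea heiko1 and blackburn describe can be sketched in a few lines of Python; the reference file name and the toy computation standing in for "project something with kernel PCA" are placeholders:

```python
import json
import os

import numpy as np

# Integration-test sketch: first run stores the example's output as a
# reference; later runs compare against it so silent algorithm changes
# are caught. File name is illustrative.
REFERENCE = "kpca_reference.json"

def run_example():
    # stand-in for running the actual kernel PCA example
    X = np.arange(12.0).reshape(4, 3)
    return (X - X.mean(axis=0)).tolist()

result = run_example()
if not os.path.exists(REFERENCE):
    with open(REFERENCE, "w") as f:
        json.dump(result, f)            # first run: store the reference
    print("reference stored")
else:
    with open(REFERENCE) as f:
        expected = json.load(f)
    assert np.allclose(result, expected), "output changed!"
    print("matches stored reference")
```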
blackburnnaywhayare: how many SFINAE experts so far? ;)17:20
-!- kipp [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]17:29
-!- deerishi [73f88294@gateway/web/freenode/ip.] has joined #shogun17:29
-!- phd [] has quit [Ping timeout: 264 seconds]17:29
-!- shrey [ca4eaca2@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]17:30
-!- Yanglittle [deb20af1@gateway/web/freenode/ip.] has quit [Quit: Page closed]17:30
-!- shrey [ca4eaca2@gateway/web/freenode/ip.] has joined #shogun17:34
-!- blackburn [] has quit [Quit: Leaving.]17:36
-!- shrey [ca4eaca2@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]17:41
-!- deerishi [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]17:42
-!- gaurang [] has quit [Read error: Connection reset by peer]17:42
-!- phd [] has joined #shogun17:47
naywhayareblackburn: no SFINAE experts but a large number of first-year undergraduates seem interested in the "9/10 difficulty" (as I listed it) project on improving the abstract tree traversals mlpack has...18:03
naywhayareand when I say "no, really, I promise, enjoy your summer, you can't learn this stuff in two weeks", they seem to end up even more interested18:03
-!- phd [] has quit [Ping timeout: 260 seconds]18:08
-!- phd [] has joined #shogun18:19
-!- deerishi [73f88294@gateway/web/freenode/ip.] has joined #shogun18:24
deerishiI know Kernel PCA, but I am still unclear about what we need to do for the kernel pca issue
-!- n4nd0 [] has joined #shogun18:31
heiko1deerishi: the point is to write a nice example to illustrate how the algorithm works18:32
heiko1some easy data, then you do your kernel PCA, and see how it changes the data18:32
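A from-scratch sketch of what such an example would show, using plain NumPy rather than shogun's own KernelPCA class (the gamma value and random data are arbitrary): build an RBF kernel matrix, double-center it, eigendecompose, and project.

```python
import numpy as np

# Toy kernel PCA on easy data: RBF kernel matrix, double centering,
# eigendecomposition, projection. Illustrative sketch only.
def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T   # pairwise squared distances
    K = np.exp(-gamma * d2)                        # RBF kernel matrix
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one     # double centering
    vals, vecs = np.linalg.eigh(Kc)                # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]    # pick the top components
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                             # projected data

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (20, 2)
```

Plotting X next to Z is then enough to "see how it changes the data", as heiko1 suggests.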
deerishiheiko1: thank you18:34
-!- shaumik [d2d43ae7@gateway/web/freenode/ip.] has joined #shogun18:36
-!- deerishi [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]18:44
-!- shaumik [d2d43ae7@gateway/web/freenode/ip.] has quit [Quit: Page closed]18:45
-!- blackburn [~blackburn@] has joined #shogun18:47
blackburnone thing on kpca18:49
blackburnheiko1: tapkee supports kpca, may be it makes sense to change the code to use it18:50
heiko1blackburn: yes that might be good18:50
heiko1could you update the task?18:50
blackburnheiko1: yeah will do18:50
heiko1or maybe add another one for that18:50
heiko1since the example doesnt depend on the backend18:50
blackburnthat's better18:50
blackburnnaywhayare: I see18:51
heiko1blackburn: could we go into the git issue again? :)18:52
blackburnheiko1: what is the issue?18:52
heiko1still no idea how to commit ;)18:52
heiko1git flow18:52
blackburnoh shh18:52
blackburnheiko1: could you please remind me the status we got into?18:52
heiko1git flow init18:53
heiko1Which branch should be used for bringing forth production releases?18:53
heiko1   - master18:53
heiko1Branch name for production releases: [master]18:53
heiko1I cannot select development18:53
blackburnheiko1: yes because it is not here18:53
blackburnheiko1: is it your fork?18:54
blackburnheiko1: lets try18:54
blackburngit checkout -b develop upstream/develop18:54
heiko1fatal: git checkout: updating paths is incompatible with switching branches.18:55
heiko1Did you intend to checkout 'upstream/develop' which can not be resolved as commit?18:55
blackburngit branch develop upstream/develop?18:55
blackburnheiko1: wait18:56
blackburnheiko1: lets do that18:56
blackburngit fetch upstream18:56
blackburnit should load all branches18:56
heiko1of course18:56
blackburnheiko1: but you'd still need to create your branch too18:56
heiko1remember  when you said git pull --rebase includes git fetch upstream?18:57
blackburnheiko1: yes18:57
blackburnpull is fetch + merge18:57
blackburnor fetch + rebase18:57
heiko1okay how to fetch the branch now?18:57
blackburnheiko1: does git branch -va18:57
blackburnshow you remote branch18:57
blackburnof develop?18:57
blackburnheiko1: alright18:58
blackburnheiko1: just create develop18:58
blackburnheiko1: I think you need to create a branch based on upstream/develop18:59
heiko1is there a difference between dev and develop?18:59
-!- kakashi_ [~kakashi_@nltk/kakashi] has quit [Ping timeout: 252 seconds]18:59
blackburnheiko1: dev should be deleted18:59
blackburnheiko1: it is deleted but git is distributed so it stayed in your copy19:00
heiko1blackburn: ok so now I created develop19:00
blackburnheiko1: git push origin develop19:00
heiko1how to update it?19:00
blackburngit rebase upstream/develop19:00
heiko1ok pushed19:01
-!- kakashi_ [~kakashi_@nltk/kakashi] has joined #shogun19:01
heiko1and now?19:01
blackburnheiko1: git flow init19:01
heiko1blackburn:  how to update the develop branch?19:01
blackburnheiko1: git gets easier once you get the basic idea that branches are homeomorphic endofunctors mapping submanifolds..19:02
blackburnheiko1: what do you mean by update?19:02
heiko1haha :)19:02
heiko1Which branch should be used for integration of the "next release"?19:02
blackburnnext release is develop19:02
heiko1and production release?19:02
heiko1Ill check your mail19:02
blackburnheiko1: master19:03
blackburnheiko1: everything is default19:05
heiko1ok worked19:05
blackburnexcept versiontag19:05
heiko1so now I modify develop right?19:05
blackburnnow you19:05
blackburngit flow feature start my_awesome_feature19:05
blackburngit commit ...19:05
blackburngit flow feature finish my_awesome_feature19:05
heiko1what does this? merge it into develop19:06
blackburnheiko1: finish merges and deletes the branch19:06
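What `git flow feature finish` does can be reproduced with plain git commands; a self-contained sketch in a throwaway repository (branch and commit names are illustrative):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email you@example.com
git config user.name you
git commit -q --allow-empty -m "init"
git branch develop

# start = branch off develop
git checkout -q -b feature/my_awesome_feature develop
git commit -q --allow-empty -m "work on feature"

# finish = merge back into develop and delete the feature branch
git checkout -q develop
git merge -q --no-ff -m "merge feature" feature/my_awesome_feature
git branch -d feature/my_awesome_feature
git log --oneline
```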
heiko1I see19:06
heiko1and then I push develop?19:06
heiko1whats the command for that?19:06
heiko1just push?19:06
blackburnheiko1: but actually I have absolutely no idea how to handle that better19:06
blackburnheiko1: yes19:06
blackburnheiko1: I'd suggest you to finally commit in the main repository19:06
heiko1at which point?19:07
blackburnPRs are additional headache here19:07
heiko1what do you  mean by that?19:07
heiko1a I see19:07
heiko1no Ill do PR19:07
heiko1better for me19:07
blackburnheiko1: just merge it directly19:07
heiko1I tend to mess up repos19:07
heiko1fucked up mine about 100 times19:07
heiko1so safer for me to work on the fork19:07
heiko1also we have a history19:07
heiko1think thats better19:07
heiko1and travis19:08
heiko1and I can see the diff in the PR view19:08
blackburnheiko1: yeah that has pros and cons19:08
heiko1I know, but Id rather stay on the safe side here19:08
heiko1thanks for the help!19:08
blackburnheiko1: but actually it would be better so consider moving some day ;)19:08
blackburnheiko1: the thing is that we do group commits this way19:09
blackburnso develop could look like a bunch of merges19:09
heiko1thats not bad right?19:09
blackburnideally it looks like19:09
blackburnmerge feature a19:09
blackburnmerge feature b19:09
blackburnmaster is even more compact19:09
blackburnit contains only tags19:10
blackburnheiko1: homeomorphic endofunctors! keep that in mind19:10
heiko1blackburn: yeah, whatever :D19:10
-!- deerishi [73f88294@gateway/web/freenode/ip.] has joined #shogun19:10
blackburnheiko1: git flow stuff is just a script19:11
blackburnthat creates branches and deletes them19:11
heiko1I know19:11
blackburnheiko1: just hides a few commands19:11
-!- votjak [] has quit [Quit: Leaving]19:16
-!- deerishi [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]19:17
-!- bogdanc [] has joined #shogun19:25
-!- ppletscher [] has joined #shogun19:30
-!- deerishi [73f88294@gateway/web/freenode/ip.] has joined #shogun19:55
-!- deerishi [73f88294@gateway/web/freenode/ip.] has quit [Ping timeout: 245 seconds]20:04
-!- aegis1 [aegis1@] has joined #shogun20:17
-!- gsomix [~Miranda@] has joined #shogun20:25
n4nd0hey gsomix20:26
n4nd0how is your project search going?20:26
gsomixn4nd0: in progress... any ideas? I think I can implement typemaps for model selection (ought to have done a long time ago >__<), but it's not enough.20:33
-!- ppletscher [] has quit [Quit: ppletscher]20:33
gsomixsonney2k: there? good evening20:34
-!- n4nd0 [] has quit [Ping timeout: 264 seconds]20:35
-!- aegis1 [aegis1@] has quit [Ping timeout: 255 seconds]20:35
@sonney2kgsomix, hey there20:37
gsomixsonney2k: how are you? I want to continue the last conversation. what's wrong with R modular interface now?20:38
-!- phd [] has quit [Ping timeout: 264 seconds]20:41
@sonney2kgsomix, it is crashing20:46
@sonney2kgsomix, IDK if you recall20:47
@sonney2kbut whenever some object is transferred from C++ -> python20:47
@sonney2kpython needs to keep somehow track of the object and potentially destroy / deallocate it20:48
@sonney2kthis is done via reference counts20:48
@sonney2kso we register 2 functions20:48
@sonney2kcalled ref()20:48
@sonney2kand unref()20:48
@sonney2kthat take care of this from python20:48
@sonney2kfor R modular these functions seem not to work20:48
@sonney2kat least we just get crashes when these are enabled20:49
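The ref()/unref() contract sonney2k describes can be modeled in a few lines of Python; this is an illustrative model of the counting semantics only, not Shogun's actual CSGObject:

```python
# Model of the ref()/unref() pair registered with the wrapper: the target
# language increments when C++ hands an object across, decrements when it
# garbage-collects its handle, and the object is freed at zero.
class Wrapped:
    def __init__(self):
        self.refs = 0
        self.alive = True

    def ref(self):
        self.refs += 1
        return self.refs

    def unref(self):
        self.refs -= 1
        if self.refs == 0:
            self.alive = False   # stand-in for the C++ side's `delete this`
        return self.refs

obj = Wrapped()
obj.ref()                  # handed to the target language: one owner
obj.ref()                  # a second handle is created
obj.unref()                # first handle collected
obj.unref()                # last handle collected: object is freed
print(obj.refs, obj.alive)  # 0 False
```

If either call misbehaves (e.g. R modular never incrementing), the object is freed while handles still point at it, which matches the crashes described above.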
gsomixsonney2k: got it20:50
gsomixsonney2k: are there any else interfaces related issues? model selection, R, lua, static interfaces... hm?20:56
@sonney2kgsomix, yeah well blackburn had this nice model selection idea21:02
gsomixsonney2k: already know :)21:03
@sonney2kgsomix, it is not a fulltime project though but just something to warm up LD21:03
@sonney2kgsomix, otherwise interface wise we have some even harder problem - our wrapper code size is too big. we would need some way of splitting up the swig generated wrapper21:05
-!- heiko1 [] has left #shogun []21:05
-!- n4nd0 [] has joined #shogun21:08
gsomixsonney2k: not clear. would this give a benefit (less time to recompile or so)?21:12
blackburngsomix: sonney2k: I will try to come up with idea on all these generic programming things tomorrow21:13
@sonney2kgsomix, yes drastically reduced compile time21:16
@sonney2kgsomix, and much less memory requirements21:16
blackburn we have a standardizer!21:16
@sonney2kinstead of 3-4GB just few 100MB21:16
-!- n4nd0_ [] has joined #shogun21:17
-!- n4nd0 [] has quit [Ping timeout: 276 seconds]21:17
-!- n4nd0_ is now known as n4nd021:17
blackburnsonney2k: hmm he is going to standardize all libraries on github!21:17
@sonney2kn4nd0, I managed to
@sonney2kconvert this to html with some python script21:19
n4nd0sonney2k: I saw the matrix before, it looks good I think21:19
gsomixsonney2k: ok, thanks a lot. there is a lot to think about21:19
@sonney2kn4nd0, so where do we put it21:20
n4nd0sonney2k: let met see21:20
n4nd0sonney2k: somewhere inside about, what do you think?21:20
n4nd0isn't already something similar on the page? about > related projects21:21
n4nd0what about there?21:21
n4nd0maybe a subpage21:21
blackburnsonney2k: I am surprised with that guy - he opens issues in various repos on that21:22
blackburnI just do not really get the motivation behind it21:22
@sonney2kblackburn, yeah but it is cool21:23
@sonney2kstaying within standard (to the extent possible) will keep us out of trouble21:23
blackburnsonney2k: yeah and out of native compiling on window$21:24
@sonney2kn4nd0, it is not just related projects though but also feature comparison21:24
@sonney2kblackburn, why?21:24
blackburnsonney2k: VS heavily violates everything I know21:24
@sonney2kahh that21:25
n4nd0sonney2k: btw why does it say shogun was last updated 2010??21:25
@sonney2kn4nd0, because that is when we updated the .csv21:25
@sonney2kerr google doc21:25
@sonney2kblackburn, I don't like having a UUID in there though21:26
blackburnsonney2k: UUID is ok but ugly21:26
@sonney2kthen everything should be autogenerated21:26
@sonney2kyeah because of ugliness21:26
blackburnsonney2k: I see no actual reason to do that21:26
n4nd0sonney2k: I would go for either about > information > feature comparison or about > related projects > feature comparison21:27
blackburnchances STL get a header named SG_CUSTOM_KERNEL_H_ are ehmm low21:27
n4nd0sonney2k: I think it would be nice though to have a link to it or sth where one can see part of the matrix png from the homepage21:27
@sonney2kn4nd0, agreed21:28
@sonney2kbut it needs updating too21:28
n4nd0sonney2k: what needs updating?21:28
@sonney2kn4nd0, that was pre shogun 1.0 IIRC21:28
@sonney2kwe have like 100'd more kernels21:28
@sonney2kand so might have other projects21:28
@sonney2kand other features of course21:29
n4nd0aaah ok I thought the google docs you have sent around today was up-to-date21:29
-!- shrey [ca4eaca2@gateway/web/freenode/ip.] has joined #shogun21:29
@sonney2kblackburn, well let him do the work no?21:29
blackburnsonney2k: he is not going to do that I think21:30
blackburnbe back later21:30
-!- phd [] has joined #shogun21:49
gsomixtime to install linux21:50
-!- n4nd0 [] has quit [Quit: leaving]21:51
-!- zxtx [] has quit [Ping timeout: 252 seconds]21:51
-!- gsomix [~Miranda@] has quit [Read error: Connection reset by peer]21:58
-!- medeeiip [~medeeiip@] has joined #shogun22:44
-!- medeeiip__ [~medeeiip@] has quit [Ping timeout: 252 seconds]22:46
-!- bogdanc [] has quit [Quit: Konversation terminated!]22:55
-!- zxtx [] has joined #shogun22:56
-!- ahcorde [553324b1@gateway/web/freenode/ip.] has joined #shogun23:16
shreyHello everyone.23:20
shrey I am new to machine learning. Can someone point me in the right direction so that I can start working on the binary classification problem? Links to reading materials/demos would be helpful.23:21
--- Log closed Thu Apr 11 00:00:27 2013