--- Log opened Thu Jun 13 00:00:43 2013
-!- iglesiasg [d58f3258@gateway/web/freenode/ip.] has quit [Quit: Page closed]00:38
@wiking i think i have bagging done00:41
-!- travis-ci [] has joined #shogun01:14
travis-ci[travis-ci] it's Viktor Gal's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun:
-!- travis-ci [] has left #shogun []01:14
-!- HeikoS [] has joined #shogun01:30
-!- mode/#shogun [+o HeikoS] by ChanServ01:30
-!- HeikoS [] has quit [Quit: Leaving.]01:41
shogun-buildbot build #426 of nightly_default is complete: Success [build successful]  Build details are at
-!- nube [~rho@] has quit [Quit: Leaving.]04:49
-!- nube [~rho@] has joined #shogun05:41
-!- nube [~rho@] has quit [Ping timeout: 246 seconds]06:18
-!- nube [~rho@] has joined #shogun07:11
-!- Yanglittle [deb20af8@gateway/web/freenode/ip.] has joined #shogun08:12
-!- lambday [67157f4e@gateway/web/cgi-irc/] has joined #shogun08:19
lambday sonney2k: sonne|work: morning :)08:21
Yanglittle hey, i have a problem. I downloaded fm_train_real.dat and fm_test_real.dat, and when I run chi2square it says "can't convert string to float"08:27
sonne|work lambday: hey good morning!08:38
sonne|work lambday: and moin moin too :D08:41
lambday sonne|work: morgen morgen :D08:41
-!- lambday [67157f4e@gateway/web/cgi-irc/] has quit [Quit: - A hand crafted IRC client]08:54
-!- patrickp [] has joined #shogun09:00
-!- iglesiasg [c1934d16@gateway/web/freenode/ip.] has joined #shogun09:02
hushell Morning, guys09:02
-!- mode/#shogun [+o iglesiasg] by ChanServ09:02
@iglesiasghushell, patrickp hi!09:02
@iglesiasgsorry for the delay :S09:02
patrickpiglesiasg: hi, no worries, also just joined 2 mins ago09:03
hushelliglesiasg: You said hi to me last week, but I missed :(09:03
@iglesiasghushell: yeah! I was going to ask you something about optimization09:03
@iglesiasgbut I ended up convincing myself about it :)09:04
hushelliglesiasg: I am not very familiar with opt :(09:04
hushelliglesiasg: about SSVM?09:04
@iglesiasghushell: it was something about the cutting plane algorithm for SSVM09:04
@iglesiasgbut never mind, it is fine now :)09:05
hushelliglesiasg: then I know a little bit :)09:05
hushellBut I am sure Patrick knows much more than me09:05
patrickpiglesiasg: for next time, feel free to send me an email, I'll try to answer in a timely manner :)09:06
@iglesiasghushell, patrickp so how is it going with the design of the SO project?09:06
@iglesiasgpatrickp: ok, I will do, thanks!09:06
hushellpatrickp: Did you send them the newest doc?09:06
patrickpnope, not yet, sorry09:07
patrickpone thing we still have to figure out: how flexible is CStructuredLabels and CFeatures09:08
hushellyeah, feel free to say something about the new plan09:08
@iglesiasgI am checking the document09:09
@iglesiasgwhat would you like to do with CStructuredLabels and CFeatures?09:09
patrickpiglesiasg: vector<CStructuredInput> and vector<CStructuredOutput> are placeholders, it might be that CFeatures and CStructuredLabels are sufficient09:09
@iglesiasgok, was going to ask about that right now :)09:10
@iglesiasgabout the second point in idea/motivation09:10
@iglesiasgsubclass vs. function pointers09:10
patrickpwell, the thing is SOP can be used for very different problems: factor graphs, multi class, parse trees, matchings09:11
@iglesiasgok, I understand09:11
patrickpit would be good if the "input" and "output" is somewhat flexible09:11
hushell We try to make things similar to SVMstruct09:11
@iglesiasgyes, the idea is that one is able to inherit from CStructuredLabels to create labels for these different types of structures09:12
patrickpand I'm not sure what's the best way to support this, an abstract input/output type that we subclass for each problem domain09:12
@iglesiasgpatrickp: what could be an alternative?09:12
hushellThe struct input and output space are important for such a framework09:12
patrickpthat's probably the only solution, yes09:13
hushelliglesiasg: Maybe you can talk about the design you came up last year09:13
@iglesiasghushell: ok09:14
@iglesiasgso there is this abstract class for the structured model09:14
hushell then we could think about what should be refined, since for the generalized SO, lots of things need to be reimplemented09:14
@iglesiasgthe idea is that for each problem domain you inherit from this class and implement the joint feature map, the loss and the inference09:16
patrickpok, that seems like what we thought, and what I think we would like to keep09:16
@iglesiasgnow, about parameter estimation09:17
@iglesiasgwe have this CStructuredOutputMachine that inherits from CMachine in Shogun09:17
patrickpCMachine is used elsewhere in Shogun, I suppose, right?09:18
@iglesiasgyes, it is used in many places in Shogun09:18
@iglesiasg all the SVMs, many classifiers, etc. originate from CMachine09:19
sonne|workpatrickp: yeah it is the most generic learning machine09:19
patrickpsorry for the silly questions, still working on getting my head around the shogun sources :)09:19
hushellpatrickp: the SOSVM ~= StructuredSVMMachine09:19
@iglesiasgpatrickp: sure :)09:19
@iglesiasgin this way you can get nice things working for all the machines e.g. model selection and x-validation09:19
sonne|workmachines basically have train() and apply() functions09:20
patrickpapply = predict?09:20
sonne|worktrain gets features as input, apply too and returns label09:20
patrickpgreat, understood09:21
sonne|workwhatever label may mean (graphs, vectors ...)09:21
sonne|workpatrickp: the name is apply because it is not always prediction - machines could do anything09:21
hushellbut why we have m_loss in CStructuredOutputMachine?09:21
@iglesiasghushell: normally you use the hinge loss, it is not the only choice though09:22
patrickpsonne|work: what do you mean by "do anything"?09:22
@iglesiasgI believe09:22
hushellthis could be included in CStructuredModel seems09:22
@iglesiasghushell: mmmm I am not entirely sure what makes more sense09:22
@iglesiasgit could be, yeah09:22
hushelliglesiasg: because then user can specify loss in his inherited model class09:23
hushellloss seems independent of solver09:23
@iglesiasghushell: but isn't this loss something that is just used for parameter estimation?09:23
patrickpiglesiasg: currently it seems there is some reference to it in both, no?09:23
@iglesiasgthere are two losses in here09:24
@iglesiasgone is the structured or09:24
sonne|workpatrickp: on apply machines just do some operation on the data and return a Label object - so they are free to do whatever they want09:24
@iglesiasgsorry, or \Delta loss -- this one is in the structured model09:24
@iglesiasgthe most common one here is the Hamming loss09:24
patrickpi see, ok, thanks for the clarification09:24
@iglesiasgbut apart from that, you have the loss you use during parameter estimation09:25
@iglesiasgI think you can say it is the loss you use to evaluate the risk you want to minimize during training09:25
patrickpiglesiasg: but do you support anything but max-margin09:25
patrickpotherwise that's true, you can either do max-margin or log-loss or something else09:25
@iglesiasg I am not sure about this, let's elaborate09:26
patrickpsonne|work, iglesiasg: why are these virtual CBinaryLabels* apply_binary(CFeatures* data=NULL); etc in the CMachine base class?09:26
@iglesiasgpatrickp: I think it is because apply in general returns CLabels*, these methods are specialized to return one of the specific types like CBinaryLabels, CMulticlassLabels, CStructuredLabels09:27
sonne|workpatrickp: we have an issue with interfaces not supporting any means of casting09:28
sonne|workpatrickp: e.g. when using apply() you just get labels back09:28
sonne|workbut then you are stuck with a CLabels object in python09:28
sonne|workyou cannot cast it into CBinaryLabels09:28
sonne|workbecause there is no cast09:28
patrickpI see, hmm, but if you would only support C++ you wouldn't need them09:28
patrickpjust to understand09:29
patrickpgreat, thanks09:29
@iglesiasgrevolution, we throw away the interfaces! :P09:29
sonne|workI think we should move these away - into CLabelsFactory09:29
patrickp:) wasn't saying that, just surprised to see these methods there09:29
@iglesiasgpatrickp: just kidding :)09:30
sonne|workpatrickp: no no it is awful IMHO and IIRC gsomix has found a way to do that in python09:30
sonne|workI mean transparent09:30
sonne|workso whenever you call apply() it will return the real type09:30
patrickpthat sounds good09:30
hushelliglesiasg: Where is the argmax() being used?09:31
@iglesiasghushell: you use it both doing parameter estimation to solve the loss-augmented inference and in apply09:31
hushelliglesiasg: I mean in solvers like libbmrm09:31
@iglesiasgthe apply in SO machines is basically the argmax09:31
patrickpcool, that sounds reasonable09:32
hushellI didn't see this being called09:32
@iglesiasghushell: in CPrimalMosekSOSVM is there, let me see about BMRM09:32
hushelliglesiasg: in svm_bmrm_solver() I didn't see it09:33
@iglesiasg hushell: I am not familiar with the BMRMs09:33
@iglesiasgbut they are cutting plane methods as well, so they must use them to find the most violated constraints or so09:34
hushelliglesiasg: Sorry I didn't check the mosek one09:34
@iglesiasghushell: they use them from risk09:34
@iglesiasgin the BMRM09:34
@iglesiasg they use the risk method in the StructuredModel, which in turn uses/implements the argmax09:35
@iglesiasghowever it seems to me there is a "risk" of being redundant here09:35
@iglesiasgpatrickp: coming back to the loss in the SOMachine again09:37
hushelliglesiasg: I see. Thanks :)09:37
patrickpiglesiasg: waiting for that, yes, i'm curious09:37
@iglesiasgso, even if you are doing max-margin with the SO-SVM09:37
@iglesiasgisn't it possible to use different losses apart from the hinge loss?09:38
patrickpyes, but it should be the same that you define in the model, no?09:38
@iglesiasgno, I don't think so09:39
patrickpi mean max-margin = hinge09:39
@iglesiasgis it?09:39
hushellgeneralized hinge loss09:39
hushellwe can say09:39
@iglesiasgI am not actually sure, I thought one can apply different losses to the max-margin or SVM approach09:39
@iglesiasgnot just in structured learning, but in any SVM09:39
patrickpsorry if I am a bit confusing, let me try to clarify09:39
patrickpwhat type of loss are you talking about? Hamming, squared etc?09:40
patrickpi.e. Delta(y_gt, y_pred)?09:40
@iglesiasgpatrickp: no, not about the Delta loss09:41
patrickpso you are talking about the surrogate loss then09:41
patrickpmax-margin, log-loss etc09:41
@iglesiasgit can be09:42
@iglesiasgI am not sure because I probably don't use the right name (surrogate it seems to be) to call them :)09:42
patrickpthat's fine, it's a general problem in the field09:42
hushellpatrickp: maybe you talk beyond the SSVM, I didn't see people use log-loss for structured learning09:42
patrickpsure, yes, SSVM = max-margin09:43
patrickpthere are just two versions: slack and margin rescaling09:43
patrickpbut that's about it09:43
hushellfor CRF, some people like Michael Collins use the log-loss09:43
patrickpCRF = log-loss09:44
@iglesiasgpatrickp: so it doesn't really make sense to train a SSVM using a logistic loss or square loss instead of the hinge loss09:44
patrickpthat's the difference between SSVM and CRF, the surrogate loss09:44
hushellpatrickp: SSVM can also used for CRF? Am I right?09:44
@iglesiasghushell: you can use the SSVM for parameter estimation of CRFs I think09:45
hushelliglesiasg: THis is what I mean09:45
patrickpwell, SSVM and CRF in the end do the same thing: they estimate the parameters of a structured model, the difference is the objective used for learning09:45
@iglesiasghushell: ok09:45
@iglesiasgpatrickp: aham09:45
patrickpCRF uses log-loss, SSVM uses max-margin. CRF in the end has a probabilistic interpretation, but you might not care about this too much09:46
hushellpatrickp: well, I would like to call CRF a type of graphical model, but I understand what you are saying09:46
patrickpyes, sure. but the terminology is quite loaded.09:47
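The distinction being drawn can be written down: both SSVM and CRF estimate the parameters w of the same structured model with joint feature map \Psi and Delta loss, but they minimize different surrogate objectives (regularization shown, constants omitted):

```latex
% SSVM (max-margin / structured hinge, margin rescaling):
\min_w \; \frac{\lambda}{2}\|w\|^2 + \sum_i \Big[
  \max_{y}\big( \Delta(y_i, y) + \langle w, \Psi(x_i, y) \rangle \big)
  - \langle w, \Psi(x_i, y_i) \rangle \Big]

% CRF (log-loss / negative conditional log-likelihood):
\min_w \; \frac{\lambda}{2}\|w\|^2 - \sum_i \Big[
  \langle w, \Psi(x_i, y_i) \rangle
  - \log \sum_{y} \exp \langle w, \Psi(x_i, y) \rangle \Big]
```

The max over y is the loss-augmented argmax; the log-sum over y is the partition function, which is exactly the extra computation a CRF objective demands from the structured model.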
hushell okay, let's get back to Shogun, this doesn't seem like a big issue09:47
@iglesiasgI am wondering then, whether it is possible to use the same training algorithm only changing the part where the loss is evaluated09:47
hushell:) good to learn09:47
@iglesiasgindeed :)09:47
patrickpiglesiasg: yes, and no: usually you have a specialized solver for each surrogate loss, depending on its properties09:48
patrickpalso, you might need to compute different things from your structured model09:48
patrickp argmax vs. partition function, that is09:49
hushelliglesiasg: user need to specify the delta function, but he/she can choose SSVM or CRF to estimate params09:49
@iglesiasgI see09:49
hushellSSVM and CRF use different surrogate losses09:50
@iglesiasgpatrickp: so with yes and no you mean there are actually algorithms for parameter estimation that work with different surrogate losses?09:50
patrickpstochastic gradient descent is probably one such example09:50
patrickpshould work well with both surrogate losses09:51
patrickpbut the surrogate loss computation would require different methods from your structuredmodel09:51
@iglesiasgthat is very interesting I think09:51
hushellpatrickp: you think we should also implement the log-loss for training?09:51
-!- foulwall [] has joined #shogun09:51
patrickphushell: at least in the beginning I would not go into this09:52
-!- Yanglittle [deb20af8@gateway/web/freenode/ip.] has quit [Ping timeout: 250 seconds]09:52
patrickpbecause then you need to support partition function computations in your structured model, which is a pain09:52
hushell So what should we add to or modify in the StructuredOutputMachine and StructuredModel?09:53
@iglesiasgI think the idea is to add other StructuredModels and SOMachines rather, I may be wrong09:54
hushellI mean in the doc, we have the new class SOSVM, this seems the same as CStructuredOutputMachine09:54
@iglesiasgyes, I think there is no need for it. The current should suffice09:54
hushellpatrickp: for the partition function, this is all about the inference, I am also curious how many functionality about inference we'd like to add to Shogun09:55
@iglesiasgthere is another question in the google docs regarding eigen vectors09:56
@iglesiasgmy opinion is use them in your implementation code but try to stick to shogun vectors for the methods that could be used from the interfaces in the other languages09:57
patrickpiglesiasg: thanks for the clarification wrt eigen09:58
hushelliglesiasg: so SGVector is preferred?09:58
patrickpfor the interface, i guess09:58
patrickpif you do something internally use eigen, is that right?09:58
@iglesiasgpatrickp: yes, internally do it as you prefer09:58
@iglesiasgeigen is probably a good option for it09:58
hushellI see, no confusion for users09:59
patrickphushell: partition function: i would really concentrate on argmax09:59
lisitsynwe have no typemaps for eigen types09:59
lisitsynthat's why :)09:59
@iglesiasghushell: exactly, it is a reason for the typemaps rather than confusion09:59
@iglesiasghushell: with the typemaps I mean09:59
@iglesiasgsay you have09:59
@iglesiasgEigen::Vector get_vector() in a StructuredModel10:00
-!- nube [~rho@] has quit [Ping timeout: 276 seconds]10:00
@iglesiasgthat would probably fail from an interface (say python) if there are no typemaps for Eigen::Vector10:00
@iglesiasgon the other hand, we have typemaps for SGVector so that method could be used in python10:01
lisitsyniglesiasg: with prob = 1 :D10:01
@iglesiasgand you would get a nice numpy.array10:01
@iglesiasglisitsyn: :D10:01
patrickpanother question: you provide labels/features in structuredModel but also in the Machines10:01
patrickpwhy this duplication?10:01
@iglesiasgpatrickp: yeah, I also wondered about a couple of weeks ago :)10:02
patrickpbut that's not some form of convention used in shogun, right?10:02
lisitsynpatrickp: no if it can be unentangled it would be better10:02
-!- foulwall [] has quit [Ping timeout: 264 seconds]10:03
patrickpcool, i'm sure one can get rid of it in one place, the question is more where10:03
@iglesiasgpatrickp: I think it may be because Machine has an attribute CLabels* m_labels10:03
patrickphmm, i see10:03
@iglesiasgand since the StructuredOutputMachine inherits from it, it gets it10:03
patrickpsame with features?10:04
@iglesiasgbecause IIRC  the duplication is only with labels, isn't it?10:04
@iglesiasgI think it is only with the labels yeah10:04
patrickpyes, that's probably true, only labels10:04
patrickphmm, quite ugly :), I'll think about it10:05
hushellpatrickp: CStructuredInput and CStructuredOutput can be just CFeatures and CStructuredLabels?10:06
@iglesiasgpatrickp: note that CLinearStructuredOutputMachine's getter and setter for the features is done through the StructuredModel10:06
patrickphushell: yes, I think so10:06
@iglesiasgso about avoiding the duplication10:06
patrickpiglesiasg: that makes sense, cool, so now I understand this a bit better10:06
@iglesiasgone could either remove the CStructuredLabels in the CStructuredModel10:07
hushellpatrickp: great, we are merging to current framework10:07
@iglesiasgwhich would be easier than removing CLabels in CMachine10:07
@iglesiasghowever, at least for me, that would be sort of weird10:07
@iglesiasgsince the labels would be in the Machine and the features in the StructuredModel -- I don't like how that would look10:07
patrickpiglesiasg: yes, I agree, have to think about this a bit more10:08
@iglesiasgI suggest, let's stick to that for the moment and later we can try to improve that10:09
@iglesiasgIIRC I took care so when you set the labels in the StructuredModel they are also set in the StructuredMachine10:09
patrickpok, now a more practical question: how should we proceed with the partial "reimplementation": I guess a new feature branch in git, right?10:10
@iglesiasgor maybe  I didn't :S10:10
@iglesiasgpatrickp: what do you mean with re-implementation?10:11
@iglesiasgmake your new stuff?10:11
patrickpcleaning up the SO framework a bit, I still feel there are a bit too many dependencies. But it wouldn't be a new implementation.10:12
patrickpbut it might be that we change quite a lot of things and break backwards compatibility, at least for some time.10:12
@iglesiasgaham I understand10:12
@iglesiasg I'd say try to change things so that backwards compatibility is not broken :) but that can be sub-optimal taking into account the new development10:13
@iglesiasgI am unsure actually10:14
patrickpsure, that's a bit my question: I think it might slow down development quite a bit, but maybe we just start off with keeping back-wards compatibility and see how it goes10:14
@iglesiasgthe issues that I see are10:14
@iglesiasgpatrickp: yes, I like that option better10:14
patrickpand in case it slows shell down too much, we can still switch modes10:14
@iglesiasgyes, I agree10:14
@iglesiasgI will try to get a couple of unit tests done ASAP10:15
@wikingpatrickp: start a new branch10:15
hushellI hope would be a problem to merge to old code in the end10:15
@iglesiasgso that we can easily figure out if something old gets broken10:15
patrickpiglesiasg: that would be great10:15
hushellI mean would not be10:16
patrickpbranching: well, then maybe the patches might be rather small initially, say "remove-loss-from-machine"10:16
@wikingpatrickp: and u can remove the setting of labels/features into the machine itself and keep it in the structmodel... and what u can do (if u want) that set the reference to the labels/features of structmodel to the machine's labels/features classes...10:16
@iglesiasgwiking: but the reference to the labels in the machine is in CMachine10:17
@wikingiglesiasg: and?10:17
@iglesiasgwiking: I guess whole shogun will get broken if we remove that one :D10:17
@wikingiglesiasg: no you don't need to remove that10:17
@wikingbut either don't set it10:18
@wikingor if it's really necessary then set it just in the constructor of structedmachine by getting a reference to the labels in structredmodel10:18
@iglesiasg patrickp, wiking the StructuredOutputMachine constructor can be easily changed to not accept this argument10:18
@iglesiasgthe duplication would still be there... but less visible :)10:19
@wikingbut structuredMachine should have something like this as ctor: StructuredMachine(StructuredModel)10:19
hushellIf we keep most of things the same in CStructuredModel and CStructuredOutputMachine, shouldn't be many problems, right?10:19
@wikingabout the loss function you can argue where u want to put it10:19
@iglesiasghushell: no, there shouldn't be problems then10:19
patrickpwiking: that's true, passing structured model to the machine makes a lot of sense10:19
@wikingwithin the model (makes more sense for me) or within the machine10:19
patrickpi would also put it in the model10:20
@wikingso just do that10:20
@wikingit's not breaking anything10:20
@wikingjust make sure of those changes... and that's it10:20
patrickpcool, that sounds good10:20
@wikinghushell patrickp both gsocers?10:21
patrickphushell: shall we make this your first target, so that you get to fiddle around a bit with the structured machine in shogun?10:21
patrickp wiking: hushell is a gsocer, i'm his mentor10:21
hushellwiking: I am the student :)10:21
@wikingpatrickp: hehe ok :)10:22
@wikinghushell: ok10:22
patrickpbut unfortunately don't know too much about the shogun internals :(10:22
hushellpatrickp: good idea10:22
@wikinghushell can u plz tag me in your PRs directly10:22
@wiking hushell: so that i can see immediately and give you feedback10:22
hushellwiking: ok, once I have one10:22
@wikingas i've been working with structedmachines quite some last year10:23
hushellwiking: Thanks :)10:23
@wikingchanged a lot in the codes around october/november10:23
hushellwiking: I see. the latent things10:23
@iglesiasgwiking is the latent guy :D10:23
@wikinghushell: yeah still some git stashes are in my local git repo10:24
@wikingthat never got into shogun yet10:24
hushellYou did so many things last year :D10:24
@wiking need to fix that soon10:24
patrickpwiking: I see, ignored the latent things completely so far.10:24
patrickphow far progressed is that10:24
@wikingpatrickp: well i have the fully working latentSOmachine here locally10:24
hushellwiking: Did you implement the loss based learning?10:24
@wikingpatrickp: works with both of the machines in shogun10:25
@wikingbut i had to create for example10:25
@wikinga proxy class10:25
@wikingthat proxied between10:25
@iglesiasghushell, patrickp : regarding the other part of the google doc, the FactorGraph10:25
@wiking StructuredModel and LatentModel10:25
@wikinghushell: loss based learning....? where?10:25
@iglesiasghushell, patrickp : I am not sure inherit from CStructuredModel10:26
@wikingiglesiasg: where's that doc? :)10:26
@iglesiasgif it should* (I am eating words for breakfast :D)10:26
hushellwiking: I remember you referenced a ICML12 paper, Pawan Kumar10:26
@wikinghushell: ah yeah10:26
@wikingi have an objection10:27
hushelliglesiasg: FactorGraph is an application like Multiclass10:27
@iglesiasgaham my understanding about FactorGraphs is limited10:28
@wikingSGVector<float64_t> feature_map(CStructuredInput*, CStructuredOutput*);10:28
@wikingnot good idea10:28
@iglesiasgI thought of it like something more general10:28
hushellso we call it general SO problem10:28
@iglesiasgas a data structure basically10:28
hushellHMM, CRF, MRF are basically factor graph10:28
hushellexpressed in FG, precisely10:29
hushellwiking: why?10:29
@wikingbecause the feature_map this way REQUIRES that the mapped feature vector is always a dense vector10:29
patrickpwiking: can you elaborate?10:29
@wikingand this is what i already hated last year10:29
patrickpah, cool, so what do you suggest?10:30
@wikingbecause imho it would be a good idea to support both dense and sparse vectors10:30
@wikingas feature vectors10:30
patrickpsure, that's very true10:30
@wikingthis limitation for no reason10:30
@wikingjust because of ease of development10:30
@wikingthe problem is that we already had last year10:30
patrickpis there a basic vector class that has dense and sparse subclasses?10:31
@wikingthat we dont have an abstract class for Vectors10:31
patrickpi see10:31
@wikingpatrickp: yep that's the problem10:31
hushellpeople usually implement two versions, another one for sparse10:31
@wikinghushell: but how can you have either one or the other be non-abstract ;)10:32
@wikingor just have 2 of those functions10:32
@wiking in the class10:32
@wikingbut of course10:32
@wikinghere comes the limitation of c++10:32
hushellwiking: That's true10:32
@wiking same function with different return types is not possible10:32
@iglesiasgwiking: have you tried in any task the joint feature map with sparse and non-sparse vectors?10:33
@wikingi mean you cannot just have SGVector<float64_t> feature_map(CStructuredInput*, CStructuredOutput*); and SGSparseVector<float64_t> feature_map(CStructuredInput*, CStructuredOutput*);10:33
@wikingand one more thing10:33
@wikingi would love that the feature_map function's argument10:33
@wikingis more free10:33
hushellthen don't return a vector10:33
patrickpeven more? like what?10:33
@wikingsomething like10:33
@wikingReturnValue featureMap (CStructuredInput*, CStructuredOutput*, ..);10:34
@wikingReturnValue featureMap (CStructuredInput*, CStructuredOutput*, ...);10:34
@wikingReturnValue featureMap (CStructuredInput*, CStructuredOutput*, vargs);10:34
@wikingbecause this way for example the latentSOMachine can use the same StructedModel10:35
patrickpso that you can pass some auxiliary parameters in there?10:35
@wikingbut for that i need the feature map to support more than 2 parameters as input10:35
@iglesiasgI guess that is to pass the latent features or so10:35
@wikingsince i have the latentVariable as well10:35
patrickpi see, cool, yeah you mean \phi(x,y,z)10:35
@wikingpatrickp: indeed10:35
@wikingwhere h = latent10:36
patrickpi use z, but doesn't matter ;)10:36
patrickpcool, yeah, this makes a lot of sense10:36
@wikingregarding this question: "use of Eigen vectors or rather the ones from shogun"10:36
@wikingsome of the SGVector functions are actually wrapped functions of Eigen3 functions for vectors...10:37
@wikingif of course Eigen is available10:37
@wiking"function pointers" = cant do it because of modular interfaces10:37
patrickpthanks for your inputs!10:38
@wikingi hope i'm not being too redundant, i.e. you already might have heard this answers :P10:38
patrickpwiking: very few, but it's always useful to see a different angle10:39
patrickpwiking: also, what is your experience with CFeatures and CStructuredLabels, are they flexible in your opinion10:39
hushellwiking: what's the ReturnValue you mentioned? a class?10:39
@wikinghushell: yeah some class... now the question is of course SGVector or SGSparseVector or what10:40
hushellwiking: maybe let them all be SGVector10:41
@wikingpatrickp: well that's a good question10:41
@wikinghushell: well i would love to be able to return a sparse variant as well :P10:41
@wikingbut then again10:42
@wikingdont waste time on that now10:42
patrickpwiking: i definitely agree on the sparse part, but might be tricky. Dynamic typing would be nice, he?10:43
hushellwiking: :D leave this to future10:43
@wikinghushell: yeah something like that10:43
@wikingpatrickp: yeah something would be good ;P10:43
patrickpjust did something like this in matlab and it was two lines of code ;(10:43
patrickpanyway, back to CFeatures and StructuredLabels10:44
patrickpwhat's your opinion?10:44
@wikingwell atm i dont see a better solution10:44
@wikingbut if you have ideas i'm listening :)10:44
@wikingas StructuredLabels can be basically anything10:45
@wikingi dont see a better way to implement it as it is now10:45
patrickpcool, that's good enough, just wanted some opinion of a user10:45
@wikingi mean we need to fit into *Labels class hierarchy of shogun that's for sure10:45
@iglesiasg I don't really see the concern about flexibility here, I mean, is there something that looks like a limitation?10:46
patrickpiglesiasg: no, I don't really know. I just want to make sure that in the end it will be very easy to get different applications going10:47
@iglesiasgpatrickp: aham I see10:47
patrickpand it would be a bummer if we make the model and machines as general as possible, but then there are hidden assumptions in features and labels10:47
patrickpthat's all10:47
@iglesiasgso for the moment I have used it for multiclass classification (proof-of-concept basically) and label sequence learning10:47
@iglesiasg and got it to work fine09:48
@iglesiasgwell I have also done some grid graphs but re-using the labels for label sequence learning10:48
patrickpcool, that sounds encouraging10:48
patrickpwell with a factorgraphlabel this should all be the same10:48
hushelliglesiasg: Is your image seg working now?10:49
@iglesiasgas you may have seen CStructuredLabels are pretty much a subtype of CLabels10:49
@iglesiasgthat contains a list of CStructuredData10:49
@iglesiasgCStructuredData is another abstract type10:49
@iglesiasg and you can inherit from it to put pretty much anything you need10:49
@iglesiasgfor instance I have done a CStructuredData that is a CSequence for label sequence learning10:50
patrickpyes, I saw this, thanks for the pointer10:50
@iglesiasgand another that is just a number for multiclass classification10:50
@iglesiasghushell: at the end I gave up the idea of doing a real world segmentation example10:50
@iglesiasgI just did a lame simplification of segmentation10:51
hushelliglesiasg: you could make some synthetic images10:51
@iglesiasgyeah exactly10:51
@iglesiasgjust straight bars of different colours I did10:51
@iglesiasghushell: have a look
-!- nube [~rho@] has joined #shogun10:52
hushelliglesiasg: For inference you reuse something from the Pystruct?10:52
@iglesiasgpatrickp: I guess the style of the document is familiar :)10:52
@iglesiasghushell: I used the linear programming relaxation10:53
patrickpiglesiasg: nice!10:53
@iglesiasghushell: a couple of figures with results are on page 65 of the printed document (79 of the pdf)10:53
hushellcong! You graduated!10:54
@iglesiasgwell not really10:54
@iglesiasgI am doing my presentation tomorrow!10:54
hushelliglesiasg: Nice work!10:54
@iglesiasgI am nervous as fuck hahaha10:54
@iglesiasgI am preparing the presentation now, I finished the slides yesterday10:54
@iglesiasgpatrickp: I hope you don't mind I basically copied the style from the sample you've got in github :S10:55
patrickpok, I think I'll have to go. thanks to everyone for the inputs! hushell: I'll send you an email sometime soon with hopefully an updated work plan, I think it's a good idea to first "fix" the two things discussed today, to get to know the internals a bit better. More about this soon10:56
patrickpiglesiasg: no worries, sure, that's why I put it on github :)10:56
patrickpand good luck with your presentation10:56
hushellpatrickp: okay, Thanks for the schedule, I'll begin to work tomorrow10:57
-!- patrickp [] has quit [Quit: patrickp]10:57
hushellI need to go to sleep now. Have a nice day guys10:58
@iglesiasgbye, great talking to you guys10:58
-!- hushell [] has quit [Quit: WeeChat 0.3.7]10:58
lisitsynwiking: so you say spinlock is not being detected on your machine?11:38
-!- gsomix [~Miranda@] has joined #shogun11:40
@wikinglisitsyn: noup11:46
-!- van51 [] has joined #shogun11:46
lisitsynwiking: can you please try to compile that test .cpp then?11:46
@wikinglisitsyn: oh lol11:49
@wikinglisitsyn: case sensitive! :)11:49
lisitsynwiking: where have I been insensitive?11:49
@wikingmmm where's the announcebot :S11:54
@wikinganyhow it is SpinLock and not Spinlock11:55
lisitsynwiking: I see11:55
lisitsynwiking: I didn't mean to be insensitive sorry I hurt you ;)11:55
@wikingfucking hell i cannot find a decision tree implementation that's decent enough11:56
@wikingi've checked yesterday night11:56
@wikingmaaaan fuck11:56
@wikingthat code is craaaaaaaaazy11:56
@wikingi mean it's gpl11:56
@wikingbut actually no sane person would take that code into their codebase11:57
@wikingfull of globals and weird typedefs11:57
lisitsynwiking: hehe11:57
@wikinglisitsyn: if u know anything let me know11:57
lisitsynwiking: yeah sure11:57
lisitsynwiking: I guess trees are too difficult engineering-wise for most researchers11:58
lisitsynyou know their code quality hah11:58
@wikingi found this... it's not thaaat bad:
lisitsynahh yeah I have seen that11:58
lisitsyndidn't check the code11:58
@wikingit's not that bad11:59
@wikingit needs some hacking around to get it into shogun11:59
@wikingbut it's certainly more usable than c5011:59
lisitsynwiking: ohh that's fantastic quality compared to other code by kilian11:59
lisitsynwiking: his MT-LMNN is totally embarrassing11:59
lisitsynspaghetti of horror12:00
@wikingok brb12:00
-!- travis-ci [] has joined #shogun12:10
travis-ci[travis-ci] it's Viktor Gal's turn to pay the next round of drinks for the massacre he caused in shogun-toolbox/shogun:
-!- travis-ci [] has left #shogun []12:10
-!- van51 [] has quit [Quit: Leaving]12:12
-!- van51 [] has joined #shogun12:12
-!- van51 [] has quit [Remote host closed the connection]12:15
-!- van51 [] has joined #shogun12:15
-!- HeikoS [] has joined #shogun12:16
-!- mode/#shogun [+o HeikoS] by ChanServ12:16
-!- van51 [] has quit [Read error: No route to host]12:17
-!- van51 [] has joined #shogun12:18
-!- votjakovr [] has joined #shogun12:21
lisitsynvotjakovr: long time no see!12:22
votjakovrlisitsyn: Hi, I'm glad to see you too12:23
lisitsynvotjakovr: what's up?12:31
votjakovrlisitsyn: sorry, i didn't warn you that i would be away. But now I'm finally back12:33
lisitsynvotjakovr: ;)12:33
-!- iglesiasg [c1934d16@gateway/web/freenode/ip.] has quit [Quit: Page closed]12:43
-!- gsomix [~Miranda@] has quit [Quit: Miranda IM! Smaller, Faster, Easier.]13:07
-!- nube [~rho@] has quit [Ping timeout: 246 seconds]13:19
-!- nube [~rho@] has joined #shogun13:57
-!- van51 [] has quit [Quit: Leaving.]14:35
-!- lambday [67157e4e@gateway/web/cgi-irc/] has joined #shogun14:42
lambdayHeikoS: exact log job thing tested14:43
-!- iglesiasg [c1934d18@gateway/web/freenode/ip.] has joined #shogun14:59
-!- mode/#shogun [+o iglesiasg] by ChanServ14:59
-!- nube [~rho@] has quit [Ping timeout: 252 seconds]15:03
@wikingHeikoS: fyi:
@HeikoSlambday:  nice! send the PR :)16:23
@HeikoSwiking: nice! :)16:24
@wikingHeikoS: it's in the new branch16:25
@HeikoSwiking:  where is the unit test? otherwise I cannot guarantee you that clone works16:25
@wikingi mean it's coming with random forest i hope16:25
@wikingah the unit test16:25
@wikingyeah i haven't got around that16:25
@wikingbecause actually i read that if we had a decision tree implementation then we could create a random decision tree and just plug that in for bagging :P16:26
-!- iglesiasg [c1934d18@gateway/web/freenode/ip.] has quit [Ping timeout: 250 seconds]16:26
@wikingso the first part is missing actually, i.e. the decision tree16:26
@wikingHeikoS: what's with __ ?16:28
@HeikoSwiking: c++ style says one should not do this16:28
@HeikoSthere was a mail some time ago on this16:28
@wikingreally? :)16:28
@HeikoSwiking: not a big deal though16:28
@HeikoSwiking: pretty cool the bagging things!16:30
@HeikoSexcited to see that in action16:30
-!- gsomix [~gsomix@] has joined #shogun16:49
-!- gsomix [~gsomix@] has quit [Remote host closed the connection]17:09
-!- votjakovr [] has quit [Quit: ERC Version 5.3 (IRC client for Emacs)]17:19
-!- votjakovr [] has joined #shogun17:34
-!- votjakovr [] has left #shogun []17:37
-!- nube [~rho@] has joined #shogun17:37
-!- nube [~rho@] has quit [Read error: Connection reset by peer]17:42
-!- nube [~rho@] has joined #shogun17:42
-!- nube1 [~rho@] has joined #shogun17:48
-!- nube [~rho@] has quit [Ping timeout: 246 seconds]17:51
-!- gsomix [~gsomix@] has joined #shogun18:16
-!- van51 [] has joined #shogun18:45
-!- zxtx [] has quit [Ping timeout: 276 seconds]18:58
-!- zxtx [] has joined #shogun20:36
-!- HeikoS [] has quit [Quit: Leaving.]21:32
lisitsynsonney2k: cheng soon ong is my action editor22:18
lisitsynthat must be good for me22:18
@sonney2klisitsyn, no it is not...22:49
@sonney2khe will be extra tough22:49
lisitsynsonney2k: he knows what progress has been made though22:49
@sonney2klisitsyn, but actually he gets the paper because he had it before and so will reviewers22:50
lisitsynsonney2k: I see22:50
lisitsynsonney2k: we will see22:51
-!- lambday [67157e4e@gateway/web/cgi-irc/] has quit [Quit: - A hand crafted IRC client]23:30
--- Log closed Fri Jun 14 00:00:44 2013