[ml] Social Network
joe at jjhale.com
Fri Nov 19 20:32:27 PST 2010
@Shahin: Great find - really fresh!
Clay mentioned that he'd be interested in giving a talk on
semi-supervised learning at some point - but I don't think he has
scheduled a time yet.
Re: Backstrom and Leskovec's Supervised Random Walks paper:
It seems like there are lots of useful ideas to explore there. It is
interesting to note how the method they develop compares with
logistic regression in terms of AUC on their Facebook data: the
supervised random walk scores 0.82799 vs. logistic regression's
0.81681. That small gap suggests to me that it is probably worth
investing time in fitting a logistic regression model initially, even
if it does require "tedious feature extraction".
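For anyone who wants to try the baseline, here's a minimal sketch of a
logistic-regression link-prediction model scored by AUC with scikit-learn.
The features here are random stand-ins, just to show the shape of the
pipeline - in practice X would hold the extracted pairwise features
(common neighbours, Adamic/Adar, etc.) and y would mark which candidate
edges actually formed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Toy stand-ins for pairwise features; replace with real extracted features.
X = rng.normal(size=(n, 3))
# Synthetic labels correlated with the first two features, so the model
# has signal to learn (1 = edge formed, 0 = edge did not form).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
# AUC is computed on predicted probabilities, matching the paper's metric.
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(round(auc, 3))
```

Once we have real features on the candidate feature page, swapping them
into X is all that should be needed.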
@Mike: It would be cool if you could give us a talk on the paper,
maybe covering some of the building blocks of their method (like using
regular random walks to rank nodes, etc.).
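For reference before the talk, here's a minimal sketch of that plain
random-walk building block: a random walk with restart (personalized
PageRank) ranking nodes by how often the walk visits them from a source
node. The toy graph, alpha, and source node are made up for illustration;
the supervised version in the paper learns edge weights on top of this.

```python
import numpy as np

# Adjacency matrix of a tiny toy graph (row i links to columns with 1s).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
P = A / A.sum(axis=1, keepdims=True)  # row-stochastic transition matrix

alpha = 0.85           # probability of continuing the walk
source = 0             # the walk restarts at this node
restart = np.zeros(len(A))
restart[source] = 1.0

# Power iteration: at each step, walk with prob. alpha, restart otherwise.
p = restart.copy()
for _ in range(100):
    p = alpha * (p @ P) + (1 - alpha) * restart

# Nodes ranked by stationary visiting probability from the source.
ranking = np.argsort(-p)
print(ranking, p.round(3))
```

Candidate friends for the source node would then just be the top-ranked
non-neighbours.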
I'll add their recommended features to the candidate feature page.
On 19 November 2010 16:20, Mike Schachter <mike at mindmech.com> wrote:
> Thanks Shahin!
> If people would find it interesting, I can go over the
> paper and present it this coming Wednesday, November 24th.
> Nothing else seems to be going on, right?
> On Fri, Nov 19, 2010 at 10:22 AM, Shahin Saneinejad <ssaneine at gmail.com> wrote:
>> I think this paper is a good starting point for a lit review, plus it's
>> very recent:
>> On Thu, Nov 18, 2010 at 11:44 PM, Joe Hale <joe at jjhale.com> wrote:
>>> I've updated the Kaggle ML wiki page to include some subpages for
>>> different elements we'll need to solve. I've also been trying to
>>> figure out the size of the problem - it looks big :) We need to decide
>>> how to keep it manageable.
>>> It would be nice to have a scalable and concrete plan of attack for next
>>> ml mailing list
>>> ml at lists.noisebridge.net