Feature hashing for large scale multitask learning [PDF]

CiteSeerX - Document Details (Isaac Councill, Lee Giles, Pradeep Teregowda)

Feature hashing for large scale multitask learning by Kilian Weinberger, Anirban Dasgupta, John Langford, Alex Smola, Josh Attenberg

Venue: International Conference on Machine Learning (ICML), 2009

Citations: 137 (23 self)

Download Links: [www.cs.mcgill.ca] [icml2009.org] [alex.smola.org] [arxiv.org] [arxiv.org] [arxiv.org]

Abstract

Empirical evidence suggests that hashing is an effective strategy for dimensionality reduction and practical nonparametric estimation. In this paper we provide exponential tail bounds for feature hashing and show that the interaction between random subspaces is negligible with high probability. We demonstrate the feasibility of this approach with experimental results for a new use case — multitask learning with hundreds of thousands of tasks.
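The hashed feature map behind this abstract sends each input coordinate j to a bucket h(j) in {1, ..., m} and multiplies its value by a sign hash xi(j) in {-1, +1}, so that phi_i(x) = sum over j with h(j) = i of xi(j) * x_j, and colliding coordinates cancel in expectation. The following is a minimal sketch of that map, assuming Python with hashlib; the MD5-based hashes and the choice m = 2**18 are illustrative assumptions, not values from the paper.

    import hashlib

    def hashed_features(tokens, m=2**18):
        """Hashing-trick feature map: token j goes to bucket h(j) in {0, ..., m-1}
        with sign xi(j) in {-1, +1}, so colliding tokens cancel in expectation.
        Returns a sparse dict {bucket: value}."""
        phi = {}
        for token, value in tokens.items():
            digest = hashlib.md5(token.encode("utf-8")).digest()
            index = int.from_bytes(digest[:8], "little") % m   # index hash h(j)
            sign = 1.0 if digest[8] & 1 else -1.0              # sign hash xi(j)
            phi[index] = phi.get(index, 0.0) + sign * value
        return phi

    # Example: a bag-of-words document mapped into a fixed 2**18-dimensional space.
    x = hashed_features({"feature": 1.0, "hashing": 2.0, "multitask": 1.0})

Deriving h and xi from different bytes of one digest only approximates the two independent hash functions the tail-bound analysis assumes; a production implementation would typically use two separately seeded hashes.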

BibTeX

@INPROCEEDINGS{Weinberger_featurehashing,
  author    = {Kilian Weinberger and Anirban Dasgupta and John Langford and Alex Smola and Josh Attenberg},
  title     = {Feature hashing for large scale multitask learning},
  booktitle = {International Conference on Machine Learning (ICML)},
  year      = {2009}
}

Keyphrases: large scale multitask learning, feature hashing, exponential tail bound, practical nonparametric estimation, new use case, multitask, empirical evidence, random subspace, high probability, experimental result, effective strategy, dimensionality reduction
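For the multitask setting in the title, the same trick lets one weight vector serve hundreds of thousands of tasks: each token is hashed once as a shared global feature and once tagged with its task (e.g. user) id, so per-task corrections live in the same m-dimensional space as the global model. The helper below is a hypothetical illustration built on the hashed_features sketch above; the task-id prefixing and separator character are assumptions, not the paper's exact encoding.

    def multitask_features(task_id, tokens, m=2**18):
        """Hash each token twice: once globally and once tagged with the task id,
        so a single m-dimensional weight vector holds a shared model plus
        per-task corrections."""
        combined = dict(tokens)
        for token, value in tokens.items():
            combined[task_id + "\x1f" + token] = value  # task-specific copy (assumed encoding)
        return hashed_features(combined, m)

    # Example: personalized features for task (user) "u42".
    x_u42 = multitask_features("u42", {"feature": 1.0, "hashing": 2.0})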
