Publication | Open Access
What Can We Learn Privately?
Year: 2008 · Citations: 327 · References: 44
Keywords: Artificial Intelligence, Privacy Protection, Engineering, Machine Learning, Information Security, Education, Communication, Learning-by-doing, Language Learning, Data Science, Data Mining, Privacy System, Computations Researchers, Knowledge Discovery, Data Privacy, Private Information Retrieval, Learning Analytics, Computer Science, Differential Privacy, Privacy, Privacy Leakage, Data Security, Cryptography, Learning Theory, Strong Confidentiality Guarantees, Private Learning
Learning problems form an important category of computational tasks that generalizes many of the computations researchers apply to large real-life data sets. We ask: what concept classes can be learned privately, namely, by an algorithm whose output does not depend too heavily on any one input or specific training example? More precisely, we investigate learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals. We present several basic results that demonstrate the general feasibility of private learning and relate several models previously studied separately in the contexts of privacy and standard learning.
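The differential privacy guarantee mentioned in the abstract is often illustrated with the Laplace mechanism: a query of sensitivity 1 (such as a count) can be released with Laplace noise of scale 1/ε, so that changing any single individual's record shifts the output distribution by at most a factor of e^ε. The sketch below is illustrative only and is not an algorithm from the paper; the function names and the ε value are assumptions made for the example.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) by inverting the CDF of a
    # uniform draw on (-0.5, 0.5).
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    # A counting query has sensitivity 1: adding or removing one
    # record changes the true count by at most 1, so noise of
    # scale 1/epsilon suffices for epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical usage: how many of 100 records are below 50?
data = list(range(100))
noisy = private_count(data, lambda x: x < 50, epsilon=1.0)
```

Smaller ε means stronger privacy but noisier answers; averaging repeated releases would degrade the guarantee, since the ε values of composed queries add up.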
| Year | Citations |
|---|---|
| 1984 | 4.2K |
| 1952 | 3.6K |
| 1984 | 3.2K |
| 2000 | 3K |
| 1965 | 2.9K |
| 2005 | 2.2K |
| 2000 | 1.7K |
| 2007 | 1.4K |
| 2007 | 1.3K |
| 1987 | 1.1K |