Saturday, October 27, 2018

Weekly Sporto bookmarks (weekly)

  • tags: openbadges IP credly digital-credentials

    • During this shift, Pearson filed two patent applications, both titled “Generation, management, and tracking of digital credentials”.
    • Yet this summer both patents were granted, and with Credly’s purchase of Pearson’s Acclaim this past spring, Credly is now the assignee.
    • This means that “ALL implementers of the IMS Open Badges v2.0 (OBv2) standard are licensed to any necessary claims under the patent that relate directly to the implementation of the standard.” This should still hold with Credly as the assignee.
    • The patents do contain information about ownership of templates, receiver acceptance, sharing, tracking views, and other technical functionality common to content-management-style web systems. These functionalities are outside the Open Badges specification itself, although without some of them, especially the content and data management aspects, it would be challenging to implement an Open Badges system (see the sketch below for what the core spec covers).
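    For context on where that boundary sits, here is a minimal sketch of the core data structure the OBv2 specification itself defines, an Assertion, written as a Python dict. The URLs, badge name, and recipient identity are hypothetical placeholders, not real issuers or badges; this is an illustration of the spec’s data model, not a full implementation.

    ```python
    import json

    # Minimal Open Badges v2.0 Assertion (hypothetical example values).
    assertion = {
        "@context": "https://w3id.org/openbadges/v2",
        "type": "Assertion",
        "id": "https://example.org/assertions/123",        # hypothetical hosted URL
        "recipient": {
            "type": "email",
            "hashed": False,
            "identity": "learner@example.org",             # hypothetical recipient
        },
        "badge": "https://example.org/badges/sample-badge",  # BadgeClass URL (hypothetical)
        "issuedOn": "2018-10-27T00:00:00Z",
        "verification": {"type": "hosted"},
    }

    # The spec’s core data model ends roughly here; template ownership,
    # receiver acceptance workflows, share tracking, and view analytics
    # (the functionality the patents describe) sit outside it.
    print(json.dumps(assertion, indent=2))
    ```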
  • tags: ethical data sharing

  • tags: ethical data sharing principles community

  • tags: hofstede cultural multicultural multinational culture

  • tags: open data

  • tags: Google AI artificial intelligence machine learning machine-learning

  • tags: data ethics in IT ethics

    • We are in the midst of a “data revolution,” where individuals and organizations can store and analyze massive amounts of information. Leveraging data can allow for surprising discoveries and innovations with the power to fundamentally alter society: from applying machine learning to cancer research to harnessing data to create “smart” cities, data science efforts are increasingly surfacing new insights — and new questions.
    • Working with large databases, new analytical tools, and data-enabled methods promises to bring many benefits to society. However, “data-driven technologies also challenge the fundamental assumptions upon which our societies are built.”
    • “In this time of rapid social and technological change, concepts like ‘privacy,’ ‘fairness,’ and ‘representation’ are reconstituted.” Indeed, bias in algorithms may favor some groups over others, as evidenced by notorious cases such as the finding by MIT researcher Joy Buolamwini that certain facial recognition software fails to work for those with dark skin tones. Moreover, lack of transparency and data misuse at ever-larger scales have prompted calls for greater scrutiny on behalf of more than 50 million Facebook users.
    • individual and collective responsibility to handle data ethically. These conversations, and the principles and outcomes that emerge as a result, will benefit from being intentionally inclusive.
    • More than 100 volunteers from universities, nonprofits, local and federal government agencies, and tech companies participated, drafting a set of guiding principles that could be adopted as a code of ethics. Notably, this is an ongoing and iterative process that must be community-driven, respecting and recognizing the value of diverse thoughts and experiences.
    • The goal of this research project is to understand how legal and ethical norms can be embedded into technology, and to create tools that enable responsible collection, sharing, and analysis of data. These issues have also been a topic of discussion at multiple recent workshops.
    • Earlier this month, a workshop at the National Academy of Sciences focused on ethics and data in the context of international research collaborations. Similarly, another recent workshop on fairness in machine learning aimed to identify key challenges and open questions that limit fairness, both in theory and in practice.
    • there are powerful incentives for the commercial sector to disregard these initiatives in favor of business as usual. It is not clear how compliance and accountability could be incentivized, monitored, or enforced in both the public and private sectors, although new European Union regulations pertaining to data privacy will affect organizations globally beginning in May 2018. Both “top-down” regulations, as well as “grassroots” efforts, are increasingly raising questions about how we might define fairness, combat bias, and create ethics guidelines in data science and AI.
    • widespread adoption of ethical data collection and data analysis practices requires more than business penalties and awareness of these issues on the part of data science practitioners and the general public.
    • Boenig-Liptsin notes, “We need to understand how our values shape our data tools and, reciprocally, how our data tools inform our values.”
    • We are seeing an increasing number of data practitioners and leaders stand up and speak about the questionable and often outright illegal collection, sharing, and use of sensitive data. For their voices to drive change, and for our society to truly harness the positive impacts of data innovation, while mitigating unintended consequences, we will need a collective effort. This effort needs to reach beyond academia and policymakers, to anyone who can contribute — from the public and private sectors.
    • voice expectations for responsible data use, bringing data practitioners together to examine existing research and evidence.
    • translate findings into actionable principles — and to hold each other accountable
    • Alongside work with regulatory bodies, the shaping of social norms can transform these principles into enforceable standards for the responsible use of data.
    • Fully harnessing the data revolution requires that we not only explore what can be done with data, but also that we understand the broader impacts of how any individual or organization’s contribution affects others.
  • tags: data ethics ethics in IT

  • tags: ethics data bigdata ethics in IT

  • tags: ethics data ethics in IT

Posted from Diigo. The rest of my favorite links are here.

Saturday, October 6, 2018

Weekly Sporto bookmarks (weekly)

Posted from Diigo. The rest of my favorite links are here.