
It is apparent to technologists in the information publishing and provision community that the authentication methods currently in use by content providers and institutions are far from ideal. The overwhelming majority of institutions providing content to patrons rely on IP-based authentication, which is built on the address system used by every connected device on the Internet. If a request comes from a specific address or range of addresses, the system allows access; if not, it is blocked. In theory, this should work seamlessly, but in reality the setup has many holes and problems.
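To illustrate just how coarse this model is, the access decision on the publisher's side amounts to little more than checking the requesting address against a list of licensed ranges, as in the minimal sketch below. The CIDR ranges shown are hypothetical documentation addresses, not any institution's real allocation.

```python
# Minimal sketch of IP-based authentication: allow a request if and only if
# its source address falls inside one of the licensed address ranges.
import ipaddress

# Hypothetical licensed ranges for an institution (documentation addresses only).
LICENSED_RANGES = [
    ipaddress.ip_network("192.0.2.0/24"),      # e.g., main campus
    ipaddress.ip_network("198.51.100.0/24"),   # e.g., branch library
]

def is_authorized(client_ip: str) -> bool:
    """Return True if the requesting address lies inside any licensed range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in network for network in LICENSED_RANGES)

print(is_authorized("192.0.2.45"))   # True: request appears to come from campus
print(is_authorized("203.0.113.9"))  # False: off-network request is blocked
```

Nothing in that check identifies the person behind the address, which is exactly the weakness discussed next.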

Some of the many problems with this type of authentication system have been obvious since it was first developed, while others have become more pressing in recent years. IP addresses are relatively easy to spoof, and the content provider never knows who is on the other side of the address; it must simply trust that the user is authorized. As Internet access became more ubiquitous and institutional users could connect from various places, it became necessary to authenticate those not connecting directly through their institutional network. The development of proxy servers, which authenticate users before passing them forward to their desired content, provided a solution to this authorization issue.
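A proxy closes that gap by authenticating the user first and then re-issuing the request from an address the publisher already trusts. The sketch below shows the general pattern only; the in-memory credential store is a hypothetical placeholder, and real deployments would check LDAP, single sign-on, or a library management system instead.

```python
# Sketch of the proxy flow (not EZProxy or any specific product): authenticate
# the remote user against institutional credentials, then fetch the content
# from the proxy host, whose IP address lies in the publisher's licensed range.
import urllib.request

# Hypothetical credential store; real systems would query LDAP or an SSO service.
INSTITUTIONAL_USERS = {"jdoe": "correct-horse-battery-staple"}

def authenticate(username: str, password: str) -> bool:
    """Check the user against the institution's directory (stubbed here)."""
    return INSTITUTIONAL_USERS.get(username) == password

def proxy_fetch(username: str, password: str, url: str) -> bytes:
    """Authenticate the remote user, then retrieve the resource on their behalf."""
    if not authenticate(username, password):
        raise PermissionError("not an authorized member of the institution")
    # The outbound request originates from the proxy's own address, so the
    # publisher's IP-based check (as sketched earlier) succeeds downstream.
    with urllib.request.urlopen(url) as response:
        return response.read()
```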

Because users don't notice IP authentication controls (which are designed to be invisible), proxy systems are often viewed as a barrier to entry and a cause of frustration. Proxy systems aren't inherently insecure, but their security depends on proper implementation, as described in a talk last month by Don Hamparian of OCLC, provider of EZProxy, the most widely adopted proxy system in the library community. Especially as more and more users access content from mobile devices, the challenge of authenticating devices that do not connect directly through an institutional network has grown dramatically.

Because of significant security breaches and the data losses that followed, a number of major players that provide backbone services, and that place a high priority on security, are pushing forward with more advanced security protocols. Google, for example, in addition to using multi-factor authentication, has begun testing new strategies for authentication to improve security. One hopes that these newer approaches will gain wider adoption.

Authentication has become an area of focus for content providers, which have seen a rise in piracy that takes advantage of loose security and authentication systems. In some ways, these pirate systems, such as Sci-Hub or LibGen, exploit holes in the security systems that publishers and libraries have put in place. Some hacking of usernames and passwords is caused by phishing or other credential breaches, for which the solution is education and better security practices. On the other hand, content security is destroyed by the willingness of some in the academic community to "donate" their log-in credentials. No security system, regardless of how well it is architected, can solve the "1D-10-T" problem presented by those willing to share their credentials with hackers in Kazakhstan.

These various problems are motivating many in the community to begin conversations about improving authentication. However, developing and implementing such advances is not simple. Publishers and institutions, each reliant on the other and each reluctant to move first, cannot impose improvements to authentication on their own. With so much at risk for institutions, libraries, and publishers, it is high time the full community started serious conversations about creating more complete solutions to the issues now facing us.

Sincerely,

Todd Carpenter

Executive Director

NISO Reports

New and Proposed Specs and Standards

ISO Releases New Series of Standards on Mobile Payments

Mobile banking involves banking systems, phone types, and commercial and private parties. The ISO 12812 series of standards and technical specifications defines related terms, addresses technical aspects of the transactions, and specifies the roles of the various parties to them. The series, being developed by working group 10 of ISO/TC 68/SC 7, includes:

  • ISO 12812-1, Core banking - Mobile financial services - Part 1: General framework
  • ISO/TS 12812-2, Core banking - Mobile financial services - Part 2: Security and data protection for mobile financial services
  • ISO/TS 12812-3, Core banking - Mobile financial services - Part 3: Financial application lifecycle management
  • ISO/TS 12812-4, Core banking - Mobile financial services - Part 4: Mobile payments-to-person
  • ISO/TS 12812-5, Core banking - Mobile financial services - Part 5: Mobile payments to business.
» Go to story

W3C's Data on the Web Best Practices Working Group Releases Drafts for Comment

The World Wide Web Consortium's (W3C) Data on the Web Best Practices Working Group has released three related drafts: "Data on the Web Best Practices" and the complementary "Data on the Web Best Practices: Dataset Usage Vocabulary" and "Data on the Web Best Practices: Data Quality Vocabulary." The aim of the material is to describe how data can best be shared, whether openly or not; how to structure citations, comments, and uses of data; and how to discuss data quality.

» Go to story

Thema version 1.2 Readies for Release

EDItEUR reports that the Thema International Steering Committee agreed most of the final details of the forthcoming revision to the book subject classification scheme during the London Book Fair in April. The committee has made available draft minutes from the meeting. The Technical Working Group focused on requirements that emerged from the increasing adoption of Thema. Among the examples of changes, EDItEUR explains, "A particularly interesting new 'narrative theme' has been introduced into the Fiction section at FXR, which in the global English language version of the scheme will be known as 'sense of place.'"

» Go to story

ONIX 3.0.3 Revision Now Available

ONIX 3.0.3, which represents a minor revision ratified at the London Book Fair, was recently published on the EDItEUR website. EDItEUR explains that, "This new release maintains the established cadence of minor revisions which add optional new functionality every two years. It broadens the range of metadata elements that can be carried in an ONIX message, meets some new or expanded requirements, and avoids adding unduly to the complexity of ONIX."

» Go to story

Media Stories

STM Future Trends 2020 Provides Publishers Insights on the Currents Affecting Them

The International Association of STM Publishers' STM Future Trends for 2020 was released on April 28. During the Annual U.S. meeting of STM, NISO Executive Director Todd Carpenter sat down with Eefke Smit, the Director of Standards and Technology at STM, and Sam Bruinsma, Senior Vice President Business Development at Brill and chair of the STM Future Lab Committee, to discuss the team's output.

» Go to story

Linking Publications and Data: Challenges, Trends, and Opportunities

"This report outlines findings from a workshop titled 'Data & Publication Linking' held January 5, 2016 in Washington, D.C., funded by the U.S. National Science Foundation's (NSF) Open Access & Open Data initiative, and the NSF's EarthCube initiative. The workshop convened a discussion on the challenges and opportunities for cross-linking data and publication repositories."

» Go to story

A Scholarly Divide: Social Media, Big Data, and Unattainable Scholarship

Social media is producing a huge amount of raw data that should be a gold mine for researchers. Most don't know how to use it, though, resulting in an unfortunate divide between data science haves and have-nots.

» Go to story

How Large is the 'Public Domain'?: A Comparative Analysis of Ringer's 1961 Copyright Renewal Study and HathiTrust CRMS Data

Estimates vary of how many titles are in the public domain in the United States. This paper explains that Barbara Ringer's "Study No. 31: Renewal of Copyright" (1960) found that 93 percent of titles published in the United States between 1923 and 1963 are in the public domain, whereas the IMLS-funded Copyright Review Management System (CRMS) project estimated the figure at approximately 50 percent. Wilkin notes that "A better understanding of the size of the public domain, gaps in the portion of the public domain that has been digitized, the specific characteristics of the in-copyright corpus, and the problems and opportunities in the remainder can help drive digitization and rights clearance efforts." He therefore sets out to arrive at a more accurate figure.

NISO Note: University of Illinois, Urbana-Champaign Library is a NISO LSA Member.

» Go to story

Open Research Unlocks Career Opportunities: An Interview Featuring Meredith Niles

Open access is great for research, but how can you work it into your career plan? In this podcast, Jen Laloup, Editorial Media Manager for PLOS, interviews Meredith Niles about how Niles used open research on her path to becoming an assistant professor of Nutrition and Food Sciences at the University of Vermont and a member of the Board of Directors at PLOS.

» Go to story

Dear Colleague Letter: Seeking Community Input on Advanced Cyberinfrastructure

The National Science Foundation's (NSF) Advanced Cyberinfrastructure (ACI) Division "supports and coordinates the development, acquisition, and provision of state-of-the-art cyberinfrastructure resources, tools, and services essential to the advancement and transformation of science and engineering." In 2013, NSF repositioned the former Office of Cyberinfrastructure (OCI) as the ACI division within the Directorate for Computer and Information Science and Engineering. Now that this arrangement has been in place for several years, the foundation is assessing the situation and seeks science and engineering community input on several questions.

» Go to story

Reality Check on Reproducibility

A brief online survey of 1,576 researchers shows that most are concerned with the issues of reproducibility in science, says Nature. "Pressure to publish, selective reporting, poor use of statistics and finicky protocols can all contribute to wobbly work," says the journal, but one-third of respondents explain that they're already taking steps to combat the problem. Some have instituted workflows that avoid a particular researcher having too much control over results, and others have learned to analyze data collaboratively, for example.

» Go to story

Perspectives on Big Data, Ethics, and Society

The Council for Big Data, Ethics, and Society was established in 2014, and since then, says this white paper, its "reports, meetings, and ongoing conversations have consistently indicated that there is a disjunction between the familiar concepts and infrastructures of science and engineering, on the one hand, and the epistemic, social, and ethical dynamics of big data research and practice, on the other." The paper synthesizes the themes and issues that have emerged over those two years, describes some of the solutions that have been found, and, even more, points to promising directions for the Council's future work.

» Go to story

LYRASIS and DuraSpace Announce Dissolution of "Intent to Merge"

Not-for-profit organizations LYRASIS, which assists libraries, archives, and museums with content creation, and DuraSpace, which offers open-source repository software, have announced that they will not merge. The decision follows months of investigation and community input, including a membership town hall in May, after the organizations announced in January their intent to combine. Both LYRASIS and DuraSpace will now continue their current operations.

» Go to story

World Wide Web Consortium (W3C) and International Digital Publishing Forum (IDPF) Explore Plans to Combine

Tim Berners-Lee, Web Inventor and W3C (World Wide Web Consortium) Director, and Bill McCoy, IDPF (International Digital Publishing Forum) Executive Director, announced at IDPF DigiCon at Book Expo America 2016 in Chicago that the two organizations intend to combine. The combined organization would continue to develop the EPUB standard while working to align the publishing industry more closely with Web technology. The next steps are to solicit comments from the memberships of W3C and IDPF and, if the members agree, to move forward with legal review and other practical details, with the goal of a combined organization by January 2017.

» Go to story