CHAPTER NO. 6.11

Sloman RG. 2007. Technology Transfer Data Management. In Intellectual Property Management in Health and Agricultural Innovation: A Handbook of Best Practices (eds. A Krattiger, RT Mahoney, L Nelsen, et al.). MIHR: Oxford, U.K., and PIPRA: Davis, California, U.S.A. Available online at www.ipHandbook.org.

Editors’ Note: We are most grateful to the Association of University Technology Managers (AUTM) for having allowed us to update and edit this paper and include it as a chapter in this Handbook. The original paper was published in the AUTM Technology Transfer Practice Manual (Part II: Chapter 4).

© 2007. RG Sloman. Sharing the Art of IP Management: Photocopying and distribution through the Internet for noncommercial purposes is permitted and encouraged.

Technology Transfer Data Management

Robert G. Sloman, CEO, Inteum Company LLC, U.S.A.

Abstract

A technology transfer office must be able to manage enormous amounts of dynamic data. This chapter examines how electronic file systems can meet this need, focusing on the importance of shared communication links and the benefits of using advanced spreadsheet applications developed by the private sector. It considers the relative merits of spreadsheets, flat file databases, and relational databases, and highlights the numerous benefits of a network solution. The chapter explains how to ensure data integrity and manage “analysis paralysis” in such systems, and it offers a self-questionnaire to guide decisions about adopting a software management solution.

1. Introduction

Managing a technology transfer office (TTO) requires strong administrative, technical, and communication skills. To make informed decisions, a tremendous diversity of information must be captured and analyzed. A TTO’s ability to handle this information is complicated by how rapidly new information becomes available. Moreover, the average academic TTO has limited funds and staff with which to create a sophisticated data management system.

Meeting these challenges and making timely, informed decisions can be very rewarding. However, as workflow increases, the ability to maintain a high standard of decision making can be compromised. If the TTO is a closed system, and no additional professional or support resources can be acquired to deal with the additional workflow, other solutions must be found. These solutions will very likely involve fundamentally changing how the office uses its available tools.

Fortunately, being one or more generations behind in implementing data management and decision support software systems does not translate into years of catch-up for the TTO. TTOs can reap the rewards of corporate investment in these areas. For more than a decade, companies have collectively spent many millions of dollars experimenting with executive decision-support software and management information systems that were designed to get information to organizations quickly and thus increase efficiency and facilitate rapid response. These objectives apply equally well to TTOs.

Airline-booking applications are good examples of large, end-user friendly, real-time information systems. Much has been learned about software design since the first implementation of such systems, resulting in more accessible applications that conform to the workflow logic of the end user. While early linear programming efforts proved inaccessible to end users, modern software application design is event driven and object oriented.

As computer prices have plummeted, the power and sophistication of computing hardware have increased dramatically. Having decommissioned exotic mainframe computers because of their exceptional maintenance and professional-support costs, companies are now implementing enterprise computing models on local area networks (LANs) of workstations that share resources from a file server.1 The computing advances pioneered in the corporate realm—specifically, improved efficiency, reliability, and work throughput—are now available to TTO managers.

2. Physical or Electronic Files?

2.1 Considerations

A resource for shared information should be accessible to those who need the information in order to make decisions. Often, technology transfer and intellectual property (IP) management decisions depend on a mix of variables, including information about the inventors, their ongoing research programs, the companies interested in the technology, the relevant patent applications and their status, and the amount of money invested in each technology transfer case. In this complex environment, electronic data management systems provide the most rapidly adaptable support tool.

Physical files suffer from some fundamental limitations. In a TTO, records (or documents) are generally filed by case or technology according to the manager’s guidelines. The technology transfer manager will probably find physical files limited and difficult to maintain because there will be only a single physical copy—unless staff members make multiple copies of files and place them in related areas. The person doing the filing makes a judgment about where best to file each document. This is why a manager may routinely find information in the “wrong place”—or not find it at all. A manager may apply certain rules for filing documents, but the rules are generally complex and loose, and therefore are frequently bent or misapplied. Often, a technology transfer manager must review an entire file to find the information in question. Another problem with physical files is the time it takes for information to be processed and correctly filed. If files are not up-to-date, a technology transfer manager may be forced to wade through stacks of paperwork to find a needed piece of information.

With an electronic system, however, a job packet can be quickly delegated to an officemate. All case-related data and activities can be transferred easily, with instructions, to another manager or to support staff. This is the electronic equivalent of handing a physical file to a person with the necessary instructions and briefing information. With a physical file, the recipient may miss relevant action items. However, with an electronic file, the previous manager can easily transfer a variety of action items associated with that case to the new manager.

Of course, one of the most compelling reasons to use a state-of-the-art data management system is the unprecedented ability to interrogate enterprise-wide data creatively. A manager can now rapidly formulate questions that in a physical file environment would be unthinkable due to the time required to assemble and analyze the information sets.

2.2 Connectivity

The key to achieving connectivity through networked computing environments is to create shared communications links, including e-mail facilities and a shared information pool. No alternative method achieves the degree of connectivity offered by a networked environment. Indeed, networked computing environments can develop connectivity between the files themselves in a way that is not possible with physical files. For example, a technology transfer manager can check to see if contact has been made with a particular company or individual, regardless of what case that contact is associated with. The labor required to accomplish this task with physical files would be prohibitive.

3. Finding the Best Tool for the Job

3.1 Computer applications

3.1.1 Spreadsheets

Financial modeling tools, called spreadsheets, were the first applications developed for the PC. Since the release of VisiCalc™, the first widely used spreadsheet, many generations of powerful analytical tools have been developed. (A secondary market has developed for templates, which add utility by providing spreadsheet layouts and built-in algorithms, enabling plug-and-play simplicity. Unfortunately, few of these templates are useful for the technology transfer professional.)

When a technology transfer manager is seeking to generate graphs from data for reports, the spreadsheet has no equal. Users can create relationships between different spreadsheets, allowing data to be shared and linked from one sheet to another. However, users who have tried to create complex links between several layers of spreadsheets know that this can be a complex task, tantamount to programming. Unfortunately, because of the soft nature of the links, they can become corrupted. One corrupt spreadsheet cell, or one with a pilot error,2 can be copied into other spreadsheets with catastrophic results. Such errors, moreover, are difficult to trace.

Of course, spreadsheets are useful for budgeting and license revenue forecasting. They are well understood and provide dramatic visual outputs, such as graphs. The modern spreadsheet is capable of conducting “what if?” scenarios that can be particularly useful when attempting to forecast patent maintenance fees, for example. Some of these packages also contain rudimentary database-like functions that create screens for data entry. However, the sheer size and complexity of spreadsheets make them difficult to program. In addition, they do not compare favorably in this area to purpose-built database products.
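
To make the “what if?” idea concrete, here is a minimal sketch, in Python rather than a spreadsheet, of the kind of maintenance-fee forecast described above. The fee schedule, portfolio ages, and growth assumptions are illustrative placeholders, not real figures.

```python
# A minimal "what if?" sketch: projecting patent maintenance fees under
# different portfolio-growth assumptions. The fee schedule and portfolio
# ages below are illustrative placeholders, not real fee data.

FEE_SCHEDULE = {4: 1600, 8: 3600, 12: 7400}   # hypothetical fee due at year N of a patent's life

def forecast_fees(patent_ages, years_ahead=5, annual_new_patents=0):
    """Return projected maintenance fees for each of the next `years_ahead` years."""
    ages = list(patent_ages)
    forecast = []
    for year in range(1, years_ahead + 1):
        # Age the existing portfolio by one year and add any new filings.
        ages = [age + 1 for age in ages] + [1] * annual_new_patents
        due = sum(FEE_SCHEDULE.get(age, 0) for age in ages)
        forecast.append((year, due))
    return forecast

# "What if?" comparison: status quo versus filing five new applications a year.
current_portfolio = [2, 3, 3, 6, 7, 10, 11]    # ages (in years) of issued patents, illustrative
print(forecast_fees(current_portfolio, annual_new_patents=0))
print(forecast_fees(current_portfolio, annual_new_patents=5))
```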

Some very sophisticated, complex systems using Microsoft’s Excel® and other software products have been developed by TTOs. Sharing these systems is encouraged, since the time required for designing linked spreadsheets suitable for managing the forecasting and budget processes is daunting.

3.1.2 Flat file databases

Flat file databases allow the user to create records containing data about a particular class of event or package of information. For example, a record on a technology and the data elements directly related to it may be contained in a single record. Patents, however, would be kept in a separate database file. In a flat file database, therefore, a user would need to consult first one file and then another in order to connect the data in meaningful ways. Because a programmer or user can change the data structure of a particular table, these databases are quite flexible. Moreover, they can be changed without upsetting relations with other databases. In short, flat file databases have the benefits of design simplicity, ready recognition by end users, and flexibility.
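
A minimal sketch of this layout, assuming hypothetical technologies.csv and patents.csv files, shows how the user must consult one file and then another to connect a technology with its patents.

```python
import csv

# Flat-file layout: one file per record class, with no enforced links between them.
# File names and columns are hypothetical:
#   technologies.csv: tech_id, title, manager
#   patents.csv:      patent_no, tech_id, status

def patents_for_technology(tech_title):
    """Consult the technologies file first, then the patents file, to connect the data."""
    with open("technologies.csv", newline="") as f:
        tech_ids = {row["tech_id"] for row in csv.DictReader(f) if row["title"] == tech_title}
    with open("patents.csv", newline="") as f:
        return [row for row in csv.DictReader(f) if row["tech_id"] in tech_ids]
```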

Though navigation is straightforward in a flat file database, the burden is on the user to look in the right place. There are other disadvantages. Generally, the end user must purchase a flat file database engine and then design his or her own system. Experienced users of flat file databases work out routines and patterns of interrogation at which they become adept; new users, however, may have trouble navigating these systems with the same sure-footedness.

In addition, reporting from a flat file database is difficult because the links required to bring information together can be as complex as those used to link cells in spreadsheets. If a technology transfer manager is contemplating a flat file database structure, she or he should consider preferred report design and useful templates, which will reduce some of the complexity.

3.1.3 Relational databases

Relational databases contain a group of tables with various aspects of the information base coded together or hard-linked to other tables. A data-input screen may draw on a number of tables to show information in a pseudorelational mode. In a truly relational database design, however, there must be one or more linking fields between tables.
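
As a rough illustration, the sketch below uses Python’s built-in sqlite3 module to show a linking field (here an assumed tech_id column) tying patent and contact records back to a technology record; the table and column names are illustrative, not a prescribed schema.

```python
import sqlite3

# Minimal relational sketch: the tech_id column is the linking field that
# ties patents and contact activity back to a technology record.
# Table and column names are illustrative only.
con = sqlite3.connect("tto.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS technologies (
    tech_id      INTEGER PRIMARY KEY,
    title        TEXT NOT NULL,
    manager      TEXT
);
CREATE TABLE IF NOT EXISTS patents (
    patent_id    INTEGER PRIMARY KEY,
    tech_id      INTEGER NOT NULL REFERENCES technologies(tech_id),
    status       TEXT,
    next_fee_due DATE
);
CREATE TABLE IF NOT EXISTS contacts (
    contact_id     INTEGER PRIMARY KEY,
    tech_id        INTEGER REFERENCES technologies(tech_id),
    company        TEXT,
    last_contacted DATE
);
""")

# One query traverses the links: every patent and company contact for a given case.
rows = con.execute("""
    SELECT t.title, p.status, c.company
    FROM technologies t
    LEFT JOIN patents  p ON p.tech_id = t.tech_id
    LEFT JOIN contacts c ON c.tech_id = t.tech_id
    WHERE t.tech_id = ?
""", (1,)).fetchall()
```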

Technology transfer managers require access to data on finances, faculty, patent prosecution, and marketing contacts, among other things. Each functional data element might be contained in a separate data management resource, but this would be inefficient. In programming parlance, access to backroom (detail) data is important, but technology transfer managers increasingly value data that can be easily navigated without any knowledge of the underlying data structure. A relational database system can accommodate this need.

Relational database systems permit the manipulation of larger sets, such as the technology portfolios for each manager and each department, among myriad other selectable criteria. Transferring sets of physical files would require a review of each file and, probably, a briefing from the previous manager. With a relational database, one can transfer an entire project from one manager to another, enabling a more efficient transfer of action items and information than is possible with physical files. This maximizes the use of professional management talent: if one manager needs to focus attention on other urgent projects, such as infringement support, cases can easily be redeployed temporarily with a relational database tool.

The inherently rigid structure and connectivity of data in a relational database give unprecedented power to look at the data and business models in different and creative ways. Exception reports, run with some frequency, can rapidly show where data gaps exist, which can drive administrative projects. Managers can forecast expenses and revenues, isolating a variety of parameters to determine whether apparent differences are real. The ability to conduct nearly instantaneous audits can help managers plan office activities, and this connectivity also enables a supervising manager to evaluate the performance of technology transfer managers using the data management system.
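
Building on the illustrative schema above, an exception report of the kind described here might reduce to a single query that lists cases missing a patent or contact record; the column names remain assumptions.

```python
import sqlite3

# Exception-report sketch against the illustrative schema above: list cases
# with no patent record or no recorded company contact.
con = sqlite3.connect("tto.db")
gaps = con.execute("""
    SELECT t.tech_id, t.title
    FROM technologies t
    LEFT JOIN patents  p ON p.tech_id = t.tech_id
    LEFT JOIN contacts c ON c.tech_id = t.tech_id
    WHERE p.patent_id IS NULL OR c.contact_id IS NULL
""").fetchall()
for tech_id, title in gaps:
    print(f"Data gap: case {tech_id} ({title}) has no patent or contact on record")
```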

Some argue that a disadvantage of a relational database is that it uses a rigorous data structure that does not allow variability. However, a rigid data structure is essential if a technology transfer manager wants to get reliable results from an electronic interrogation. To accommodate the real need for free-form annotations, it is possible to provide note or memo fields in which special details can be recorded. Indeed, a technology transfer manager should look for a balance between rules and flexibility when selecting or designing a relational data management system.

In some relational database models, connectivity is enhanced by regularly downloading recent data that can be read and interpreted by all office members. This works best when the office eschews a hierarchical structure. If the office director, managers, and support staff are electronically briefed about cases and contacts, then meetings can proceed more efficiently, and briefing sessions can be shortened or eliminated. When meetings do occur, it is more likely that decisions can be made with confidence; those who are not directly involved in the case may still have sufficient information to contribute useful ideas. Also, when support staff is kept current they can plan their workflow more efficiently.

In relational database design, there are rules that describe how data should be “normalized.”3 Rigid rules dictate elegance and resource efficiency. For transaction-based databases, the design can be optimized to increase the speed of recording a sales transaction or stock movement. Alternatively, the design can be optimized for ready access to a large pool of related data. This latter version most conforms to the needs of a technology transfer management information system. The reason is simple: technology transfer decisions are based on complex, variable information. A technology transfer manager requires access to a range of information, including IP status, commercial contacts, expenses, and other information. The transaction- and related-data design paradigms, however, need not be mutually exclusive. In other words, even if the demand for data interconnectedness dominates, the goal of high-speed response need not be abandoned.

Given the complexity of technology transfer data management requirements, the relational database is the engine of choice: it requires less data entry and is easier to maintain and audit. With expert programming code, a relational database can quickly present the information a technology transfer manager needs. Because the complexity of the data sets requires such powerful and capable computing tools, the commercial databases used by the technology transfer community are all relational or pseudorelational database engines.

One perceived disadvantage of licensing an independent vendor’s technology transfer management system is that the vendor controls the structural design: additions arrive only with the vendor’s next generation of offerings, and the end user is not able to modify the data structures as needed. Viewed from the perspectives of both vendor and licensee, there are excellent reasons for this limitation. Developing the code for such applications frequently costs many thousands of dollars and requires years of careful thought and programming; an investment in programming code of this type can exceed US$200,000.

3.2 Network solutions

All of the above database tools can be shared over a local area network (LAN). However, only relational databases can function reliably in multiuser mode, with a number of users accessing the same data pool simultaneously, without fear of data corruption. For example, on a LAN, if a technology transfer manager were to open a spreadsheet file that someone else had on his or her screen, the manager would either receive an error message indicating that the file was in use or be advised that it was available in read-only mode. In the latter scenario, any changes made would be lost. More accurately, they would be saved but then overwritten when the person who had the file open first saved it. Flat file databases may be problematic in the same way.

Relational databases have built-in record locking and transaction-tracking features that control the access to shared files and the procedures used to update data. Many TTOs associate networks with the Internet. This chapter, however, is addressing LANs, a computing environment where one computer acts as the file server for client workstations. LAN technology has advanced dramatically in the last several years, with a number of well-supported systems available. Even for small TTOs, the advantages of using a LAN in combination with a relational database are remarkable.
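
As a rough sketch of how a relational engine arbitrates concurrent updates, the example below wraps an update in a transaction and lets the engine handle locking. It uses SQLite only as a stand-in for the client-server engines a shared office database would more likely run on, and the table name follows the earlier illustrative schema.

```python
import sqlite3

# Sketch of a transactional update: the engine, not the application, arbitrates
# concurrent writers. SQLite is used here purely as a stand-in.
con = sqlite3.connect("tto.db", timeout=30)   # wait up to 30 s if another user holds the write lock
try:
    with con:                                 # opens a transaction; commits on success, rolls back on error
        con.execute(
            "UPDATE patents SET status = ? WHERE patent_id = ?",
            ("Office action received", 42),
        )
except sqlite3.OperationalError:
    print("Record is locked by another user; try again shortly")
```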

3.3 Data portability

Most software applications are able to export and import data. The advantages of data portability are evident. If a technology transfer manager can enter data in one application and transport it in an organized fashion to a different application, data doesn’t have to be entered twice. Rekeying data not only wastes time but also increases the likelihood of data integrity problems if data is recorded differently in two places (for example, if the date of receipt of funds from a licensee or the response due date for a patent application office action is wrongly entered).

It is important to use the most appropriate tool for a given job. Relational databases are the best all-around data management tool. Spreadsheets are a good tool for financial analysis and graphics. A technology transfer manager may choose to use a relational database engine to store data and then export data to a spreadsheet for manipulation and graphing.
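
A minimal sketch of that workflow, again assuming the illustrative schema used earlier, exports a query result to a CSV file that any spreadsheet can open for graphing.

```python
import csv
import sqlite3

# Export sketch: pull data out of the relational store and into a CSV file
# that a spreadsheet can open for manipulation and graphing. The query and
# column names refer to the illustrative schema sketched earlier.
con = sqlite3.connect("tto.db")
rows = con.execute("""
    SELECT t.title, COUNT(p.patent_id) AS patent_count
    FROM technologies t LEFT JOIN patents p ON p.tech_id = t.tech_id
    GROUP BY t.tech_id
""")
with open("portfolio_summary.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in rows.description])   # header row taken from the cursor
    writer.writerows(rows)
```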

Relational database engines are at the core of all commercially available accounting packages. An increasing number of commercially available accounting packages are designing their database file structure to be compatible with DBase®. DBase data file structures, in turn, are an example of so-called XBase data structures. When the data structures between two applications are equivalent or compatible, fewer steps are required to translate data between them. So, if an accounting package with DBase-compatible data structure is used, it would be advantageous to choose a management information system with a compatible data file structure. DBase data file structure is currently supported and promoted by two of the leading proprietary relational-database engine suppliers. Accordingly, a technology transfer manager should be aware that not all relational database engines are compatible with DBase.

3.4 Data distribution

Data distribution means providing rapid access to current information to precisely those people who require it to make informed decisions. The ease with which data can be queried will determine how often the database is used by the technology transfer staff. With the power of relational database engines and the connectivity of a LAN, designs that can be easily interrogated by end users are now possible.

The technology transfer manager should view the investment in acquiring a system and the time spent on data entry as a productive asset. The system’s data should be fully utilized by the technology transfer manager to coordinate office activities and to generate reports, whether on a regular schedule or on an ad hoc basis.

3.5 Paradigms for data management

The main design paradigms for technology transfer management information systems are driven either by (1) committee and administration or (2) end-user functionality. System designs that are driven by the former usually prioritize the design of report outputs. Administration, for example, may announce, “We want a monthly report showing which patent applications are due for a maintenance fee, sorted by the technology licensing manager.” As a result, a table structure may be defined and a report written to support this management objective. But while defining objectives is important, this approach may create conflicts in terms of data structure. To create a design of this type requires the consideration of all the ways the data may be interrogated, while at the same time avoiding massive data duplication, rekeying, or excessive look-up requirements that slow a system down.
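
Against the illustrative schema sketched earlier, the report the administration asks for might reduce to a single query; the 90-day window and column names are assumptions made for the sake of the example.

```python
import sqlite3

# Sketch of the report described above: patent applications with a maintenance
# fee due in the next 90 days, sorted by licensing manager. Table and column
# names follow the earlier illustrative schema.
con = sqlite3.connect("tto.db")
report = con.execute("""
    SELECT t.manager, t.title, p.patent_id, p.next_fee_due
    FROM patents p JOIN technologies t ON t.tech_id = p.tech_id
    WHERE p.next_fee_due <= date('now', '+90 days')
    ORDER BY t.manager, p.next_fee_due
""").fetchall()
for manager, title, patent_id, due in report:
    print(f"{manager}: patent {patent_id} ({title}) fee due {due}")
```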

If the system is designed around the very specific interrogatory output paradigm, the administrative objectives will be supported, but the ease of use for end users will be diminished. When a management information system provides little end-user functionality, it will not be kept as current as one that does. With daily functionality, end users more easily navigate around other parts of the system. Even though most users will spend 80% or more of their time in a single module, they will be familiar enough with navigation techniques to find their way to other relevant sections when the need arises.

A technology transfer manager may want to opt for a system designed first for the end user, but with powerful and flexible administrative report functions. The design goal should be to create a system that acts as a partner in real time, so that data is entered as the workday unfolds. If users enter the data as they move along during their day, data entry is more current and accurate. In addition, the time burden decreases and the sense of accomplishment is enhanced.

4. Data Integrity

4.1 Assigning data-entry tasks

For day-to-day contact functions, users should have the flexibility to use the system in a way that supports their work habits. Relying on technology transfer managers to complete data entry on their patents and licenses may not be the most effective use of their time. Rather, this task could more efficiently be delegated to the individual responsible for administering the contracts or to an experienced administrative staff person. It is desirable that a single individual be delegated the responsibility of entering specific sections of the data (for example, the data on patent prosecution and revenues and expenses for each technology or case). This approach reduces the likelihood of errors and data duplication. In general, a management information system should allow an administrative support staff member to easily complete such data entry.

4.2 Auditing

It is preferable to conduct audits of the information in all environments. Reports can accomplish this function and can be set up to run at certain intervals or to run on an as-needed basis. In addition, for truly mission-critical information, reports should be created and submitted to outside professional service providers for periodic review. An example might be generating reports from the database with current information about a particular patent prosecution and presenting that report, or portfolio of reports, to the patent attorney. Staff could then request that the attorney update the report.

One direct and immediate benefit of this approach is improved data integrity. Another benefit is that service providers may come to appreciate how highly the university values accurate information about its technology transfer assets (in this case, patent applications).

If a technology transfer manager is interested in implementing such a review, doing so on a rotating basis, rather than as a direct audit of all records, may be sufficient and would reduce incremental costs.

5. Analysis Paralysis

The term analysis paralysis is being used here to describe a period of time when an office shuts down operations, virtually stopping all services, to allow the staff time to update, analyze, modify, and discuss the technology data. This process can be an excellent educational experience for an entire office staff. Generally, teams should be planned in advance and assigned a batch of technology or case files to find answers to predefined questions. This process can help define the office’s future mode of operations and may uncover areas in need of attention. If the entire staff is engaged in the process, a sense of team building may be achieved.

Through this process, a technology transfer manager may be able to anticipate questions from the university’s administration. Moreover, if all technology staff are involved in the production and interpretation of the data, experts among the staff may emerge in different fields. And finally, periodic analysis of data results allows for a faster response when a quick, unexpected analysis is needed. This “time out” might seem an impossible goal, but the rewards can far outweigh the cost.

6. Evaluating Software Solutions

If a technology transfer manager is going to adopt a set of software management solutions, this author suggests taking the process to its most advanced state possible. In determining suitability, a number of questions should be asked (see Box 1).

The decision to design a system or acquire a commercially available software package to manage technology transfer data should be based on the TTO’s needs. Like all computer solutions, the system will be only as good as the people using it. Therefore, a final consideration when purchasing or developing any system is the likelihood that staff will actually use the software. It is not necessarily true that staff who collectively design a system will be more likely to use it. This may sound counterintuitive, but it is based on the author’s real-world experience.

BOX 1: KEY QUESTIONS FOR DECISION MAKERS IN EVALUATING SOFTWARE SOLUTIONS

  1. How suitable to the task is the software solution?

    The solution recommended in this chapter is not cheap, especially when a technology transfer manager considers the cost of a LAN, a commercially available package, and training.

  2. Is adopting the software solution worth the investment of both money and staff time?

    Only the technology transfer manager can answer that question, taking into consideration all variables of the university and the TTO. A technology manager may want to consider the following advantages of incorporating a software solution:

    a) Managers with ready access to current data can work faster and with greater accuracy and can make decisions with increased confidence.

    b) Staff will be more likely to bring important issues to the attention of the supervising technology transfer manager, and necessary interventions will more likely occur.

    c) As a training tool for new technology licensing managers, the software tools described in this chapter can create an environment where staff can work more efficiently, with fewer work projects falling behind schedule.

    d) Software solutions can increase responsiveness to clients and the ability to analyze workflow and make appropriate resource allocations.

  3. Why is time being spent in entering the data (as opposed to completing the day-to-day functions)?

    One possible response to this question is that data entry creates a work environment where relevant data can be readily accessed when needed by users, managers, and support staff so informed decisions can be made in a timely fashion.

7. Conclusions

A key element in developing a data management system is setting clear goals for effective data management. The technology transfer manager should have information to support the essential tasks of the office staff in both tactical and strategic modes. Tactical support means ensuring ready and current access to information about all aspects of a particular case. The strategic mode demands the presentation of information that can illuminate trends and assist in office organization, workflow distribution, and planning. Other examples of such data use include revenue forecasting and cash-flow planning. While cash flow may not yet be a prominent issue in all academic TTOs, the cost of doing business in the field of technology transfer is increasing rapidly, and cash-flow planning may soon become imperative.

Data management tools should act in concert with the goals of managers and adapt to the way managers work, instead of requiring users to adopt a certain pattern of processing information. Regimentation of data is important, but this need not create a barrier to end users.

It is also important to think ahead and design an application for the future. As programming tools and desktop computers have become more powerful, workgroup software built from event-driven, rather than procedurally programmed, applications has emerged in full graphical-user-interface formats. The industrial relational-database literature reveals that the focus of application development is moving away from the exotic hardware of the mainframe and minicomputer and toward the client-server model of distributed computing environments such as LANs.

The TTO management experience is relatively new, and the cost of failing to manage data professionally is not yet widely recognized. Examples of such costs include large, unpaid obligations that persist because of inefficient methods for collecting revenues, or poor management of a technology portfolio. Both of these situations can impose real costs on the TTO, although it may take several years for this to become evident.

With a properly designed and implemented software solution, a manager can be more confident that the data needed to support a decision are at hand. By allowing managers and staff to be more responsive to clients, data management solutions can also dramatically enhance the general professionalism of an office.

Endnotes

All referenced Web sites were last accessed between 1 and 10 October 2007.

1 A file server is a high-powered personal computer linked by communication cables to computer workstations. The file server provides storage of shared data files and software applications, as well as printer sharing capabilities.

2 A spreadsheet “pilot error” is a data entry error made in an algorithm or data cell that causes erroneous results.

3 “Normalized” data has been organized into relationships in a way that seeks to minimize duplication of data and maintain data integrity.
