Computer Science Department
Automated Execution and Data Collection for Games
A fundamental problem exists in the field of computing metrics: while tools exist to measure many aspects of computing performance through a process called benchmarking, these tools are often haphazardly built at the user level. Because of this tradition of poor user-interface design, such tools often require an unreasonable amount of user interaction, and many of their capabilities are hidden behind poorly documented commands. As a result of these design choices, these otherwise useful tools have a steep learning curve, both to learn initially and to use effectively over time, consuming user time that could be better spent elsewhere. The goal of this project is to abstract and automate the use of these tools, which can be difficult to use collectively, so that users can spend their time on more productive tasks. This will be accomplished by creating a graphical application that interfaces with multiple performance-benchmarking programs, executes each one as desired by the user, and aggregates the results of these various programs into a single, easy-to-understand interface. Given the large number of benchmarking applications available, the scope of this tool will be limited to the automation of seven video games that can be used as benchmarking applications.
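The core automation loop described above, launching each benchmarking program in turn and collecting its output for later aggregation, could be sketched roughly as follows in Python. All game names, executable paths, and command-line flags in this sketch are illustrative placeholders, not the actual seven titles or their real benchmark flags:

```python
import subprocess

# Hypothetical benchmark definitions: each entry maps a short name to the
# command line that launches that game's built-in benchmark mode.
# Every path and flag here is a placeholder for illustration only.
BENCHMARKS = {
    "game_a": ["game_a.exe", "--benchmark", "--quit-on-finish"],
    "game_b": ["game_b.exe", "-bench", "-autoexit"],
}

def run_benchmark(name, command, timeout=600):
    """Launch one benchmark process and capture its console output."""
    result = subprocess.run(command, capture_output=True, text=True,
                            timeout=timeout)
    return {"name": name, "exit_code": result.returncode,
            "output": result.stdout}

def run_all(benchmarks):
    """Execute each benchmark in turn, aggregating all results into
    one list that a graphical front end could then display."""
    results = []
    for name, command in benchmarks.items():
        try:
            results.append(run_benchmark(name, command))
        except (FileNotFoundError, subprocess.TimeoutExpired) as exc:
            # Record failures too, so the interface can report them.
            results.append({"name": name, "exit_code": None,
                            "output": str(exc)})
    return results
```

In a design like this, the graphical interface would only need to call `run_all` and render the returned list, keeping the per-game launch details hidden from the user.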