PerfKitBenchmarker

PerfKit Benchmarker is an open-source benchmarking tool used to measure and compare cloud offerings. It is licensed under the Apache 2 license and is a community effort involving over 500 participants, including researchers, academic institutions, and companies, together with its originator, Google.

General

PerfKit Benchmarker (PKB) is a community effort to deliver a repeatable, consistent, and open way of measuring cloud performance. It supports a growing list of cloud providers, including Alibaba Cloud, Amazon Web Services, CloudStack, DigitalOcean, Google Cloud Platform, Microsoft Azure, OpenStack, Rackspace, and IBM Bluemix (SoftLayer). In addition to cloud providers, it supports container orchestration systems such as Kubernetes[1] and Mesos,[2] as well as local "static" workstations and clusters of computers.[3]

The goal is to create an open-source, living benchmark framework that represents how cloud developers build applications, evaluate cloud alternatives, and learn how to architect applications for each cloud. It is "living" because it will change and evolve quickly as developer practices change.

PerfKit Benchmarker measures the end-to-end time to provision resources in the cloud, in addition to reporting the most standard metrics of peak performance, e.g. latency, throughput, time-to-complete, and IOPS. It reduces the complexity of running benchmarks on supported cloud providers to a set of unified and simple commands, and it is designed to operate via vendor-provided command-line tools.
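
As a minimal sketch of that unified interface (full per-provider examples are given under "Example runs" below), the same benchmark can be pointed at a different provider by changing only the --cloud flag; depending on the provider, additional flags such as a project ID may also be required:

$ ./pkb.py --cloud=GCP --benchmarks=iperf
$ ./pkb.py --cloud=AWS --benchmarks=iperf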

PerfKit Benchmarker contains a canonical set of public benchmarks. All benchmarks run in their default/initial state and configuration (not tuned in favor of any provider). This provides a way to benchmark across cloud platforms while getting a transparent view of application throughput, latency, variance, and overhead.[1]

History

PerfKit Benchmarker (PKB) was started by Anthony F. Voellm, Alain Hamel, and Eric Hankland at Google in 2014. Once an initial "alpha" was in place, Anthony F. Voellm and Ivan Santa Maria Filho built a community including ARM, Broadcom, Canonical, CenturyLink, Cisco, CloudHarmony, CloudSpectator, EcoCloud@EPFL, Intel, Mellanox, Microsoft, Qualcomm Technologies, Inc., Rackspace, Red Hat, Tradeworx Inc., and Thesys Technologies LLC.

This community worked together behind the scenes in a private GitHub project to create an open way to measure cloud performance. The first public "beta" was released on February 11, 2015, and announced in a blog post, at which point the GitHub project was opened to everyone. After almost a year, and with wide adoption (600+ participants on GitHub), version 1.0.0 was released along with a detailed architectural design on December 10, 2015.

Benchmarks

The benchmarks available in PerfKit Benchmarker, grouped by category, are listed below; the latest set of benchmarks can be found in the GitHub README file,[2] and an example invocation is shown after the list.

Big Data / IoT

- Aerospike YCSB

- Cassandra YCSB

- Hadoop Terasort

- HBase YCSB

- MongoDB YCSB

- Redis YCSB

High Performance Computing

- HPCC

Scientific Computing

- Scimark2

Simulation

- OLDIsim

Web benchmarks

- etcd

- EPFL CS Web Search

- EPFL CS Web Serving

- Tomcat

Storage benchmarks

- Bonnie

- File Copy

- Fio

- Google Cloud BigTable

- Object Storage

- Synthetic Storage

- Sysbench OLTP

CPU benchmarks

- Coremark

- Spec CPU 2006

Network benchmarks

- Iperf

- Mesh Network

- Netperf

- Ping

System micro-benchmarks

- Cluster Boot

- Unixbench
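
Benchmarks from this list are selected at run time with the --benchmarks flag used in the examples below; as a sketch (assuming, as the GitHub README describes, that the flag accepts a comma-separated list of benchmark names), several benchmarks can be requested in a single invocation:

$ ./pkb.py --cloud=GCP --project=<GCP project ID> --benchmarks=iperf,ping,fio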

Industry participants

Since Google open-sourced PerfKit Benchmarker, it has become a community effort involving over 30 leading researchers, academic institutions, and industry companies. These organizations include ARM, Broadcom, Canonical, CenturyLink, Cisco, CloudHarmony, Cloud Spectator, EcoCloud@EPFL, Intel, Mellanox, Microsoft, Qualcomm Technologies, Rackspace, Red Hat, and Thesys Technologies. In addition, Stanford and MIT are leading quarterly discussions on default benchmarks and settings proposed by the community. EcoCloud@EPFL is integrating CloudSuite into PerfKit Benchmarker.

Example runs

Example run on Google Cloud Platform

$ ./pkb.py --cloud=GCP --project=<GCP project ID> --benchmarks=iperf --machine_type=f1-micro

Example run on AWS

$ ./pkb.py --cloud=AWS --benchmarks=iperf --machine_type=t1.micro

Example run on Azure

$ ./pkb.py --cloud=Azure --machine_type=ExtraSmall --benchmarks=iperf

Example run on Rackspace

$ ./pkb.py --cloud=Rackspace --machine_type=standard1 --benchmarks=iperf

Example run on a local machine

$ ./pkb.py --static_vm_file=local_config.json --benchmarks=iperf
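
The JSON file passed to the flag above describes the pre-existing "static" machines that PKB should use instead of provisioning cloud resources. The sketch below is illustrative only; the key names (ip_address, internal_ip, user_name, keyfile_path) and values are assumptions about the expected format, and the GitHub README should be consulted for the authoritative schema.

[
  {
    "ip_address": "10.0.0.5",
    "internal_ip": "10.0.0.5",
    "user_name": "perfkit",
    "keyfile_path": "/home/perfkit/.ssh/id_rsa"
  }
]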

References