Jellyfish now helps software engineering teams benchmark against industry standards
Software engineering management platform Jellyfish launched what it calls the industry’s “first benchmarking tool,” which allows engineering managers to check their performance against that of other companies.
Jellyfish Benchmarks, as the product is called, is based on the company’s own internal data, which it collects and aggregates when engineering teams agree to share their anonymized data with the wider pool.
Founded in 2017, Jellyfish’s primary mission is to align the activities of engineering teams with companies’ business objectives. It does this by analyzing a myriad of engineering “signals” gleaned from development tools such as issue trackers, source code management platforms, and project management tools. The goal is to establish what teams are working on, track the progress they’re making, and assess the performance of teams and individual contributors.
By ushering in aggregated, industry-wide engineering data, it brings more context to the mix, allowing companies to compare and contrast internal numbers with those of their peers across industries.
So what kinds of metrics does Jellyfish offer? Users have access to over 50, including time invested in growth, problems solved, deployment frequency, merged pull requests, coding days, incident rate, and mean time to repair (MTTR), among many others.
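To make a couple of these metrics concrete: deployment frequency and MTTR are typically derived from timestamped deploy and incident records. The sketch below is not Jellyfish’s implementation — the company does not publish its formulas — but a minimal illustration, under common definitions, of how such metrics can be computed from raw event data.

```python
from datetime import datetime, timedelta

def deployment_frequency(deploy_times, window_days=28):
    """Average deployments per week over the most recent window."""
    cutoff = max(deploy_times) - timedelta(days=window_days)
    recent = [t for t in deploy_times if t >= cutoff]
    return len(recent) / (window_days / 7)

def mean_time_to_repair(incidents):
    """Mean hours from incident start to resolution.

    `incidents` is a list of (start, resolved) datetime pairs.
    """
    hours = [(end - start).total_seconds() / 3600 for start, end in incidents]
    return sum(hours) / len(hours)
```

A benchmarking product would then compare each team’s numbers against an anonymized distribution of the same metrics across its customer base, for example by percentile rank.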
“Importantly, Jellyfish includes benchmarking of how teams allocate or invest their time and resources – this helps teams understand how they compare their time investments in innovation, support work or keeping the lights on, for example,” Jellyfish product manager Krishna Kannan told VentureBeat.
At the time of writing, approximately 80% of Jellyfish customers agree to share their anonymized data in the benchmark dataset, and only those customers are able to benefit from the new product. To get a little, you have to give a little: that’s the general idea.
“When Jellyfish customers onboard, they have the ability to leverage industry benchmarks based on anonymized datasets from other Jellyfish customers – customers who sign up will have their data anonymized and added to the pool of Jellyfish’s reference customers,” Kannan said. “In the rare event that customers opt out of this opportunity, their data set will not be added, but they will also not be able to take advantage of benchmarking as a feature.”
While software development teams arguably have access to more engineering data than ever before, it’s not always possible to know from this data how well teams are actually performing on an ongoing basis. It may be that they are doing well relative to their own historical numbers, but are still grossly underperforming compared to companies elsewhere. This is the core problem that Jellyfish Benchmarks seeks to solve.
It’s also worth noting that Jellyfish rival LinearB offers something similar in the form of Engineering Benchmarks, which covers nine metrics. Jellyfish, however, says it benchmarks dozens of metrics, which could open the tool up to a wider range of use cases.*
“The reality we’ve seen is that different teams are looking to optimize for different metrics depending on their product, stage, business goals, etc.,” Kannan said. “That’s why we’ve included benchmarking for the metrics that matter most to our customers.”
*Update to correct a previous statement suggesting that LinearB’s benchmark product was not fully integrated with its platform.
VentureBeat’s mission is to be a digital public square for technical decision makers to learn about transformative enterprise technology and conduct transactions. Learn more about membership.