We’re not looking for mistakes. We see them right away!

Benefits of working with auticon in Quality Assurance & Testing projects


Quality assurance testing is the process of ensuring that an organization’s software, websites, and web-based applications work correctly and are free of bugs. Quality Assurance Analysts stop problems before they begin, identify changes to the user interface, and recommend new problem-solving features before the software is released to the public. Quality Assurance can include the development of Test Case Automation and Regression Testing.

auticon consultants have unique skills that make them exceptional testers – with the ability to notice patterns, identify anomalies, and focus over extended periods with little mental fatigue: Rather than looking for errors, auticon consultants intuitively see errors. Rather than merely carrying out repetitive tasks, they complete these with enthusiasm, exceptional attention to detail and sustained concentration. Bugs don’t stand a chance when auticon consultants apply their unique visual and analytical abilities.

From fundamental functional testing skills to complex equivalence partitioning and boundary value analysis, our team is trained by instructors with over 20 years of experience and prepared for a variety of software testing projects.
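To make the equivalence partitioning and boundary value analysis mentioned above concrete, here is a minimal sketch in Python against a hypothetical age validator; the function and its 18–65 range are illustrative, not taken from any client project:

```python
# Hypothetical validator under test: accepts ages 18-65 inclusive.
def is_valid_age(age: int) -> bool:
    """Return True if the age falls in the accepted range 18-65."""
    return 18 <= age <= 65

# Equivalence partitioning: one representative value per partition
# stands in for every value in that partition.
partitions = {
    "below_range": (10, False),  # invalid partition: age < 18
    "in_range":    (40, True),   # valid partition: 18 <= age <= 65
    "above_range": (70, False),  # invalid partition: age > 65
}

# Boundary value analysis: test just outside, on, and just inside
# each boundary, where off-by-one defects tend to hide.
boundaries = [
    (17, False), (18, True), (19, True),   # lower boundary
    (64, True),  (65, True), (66, False),  # upper boundary
]

for name, (value, expected) in partitions.items():
    assert is_valid_age(value) == expected, name

for value, expected in boundaries:
    assert is_valid_age(value) == expected, value

print("all partition and boundary checks passed")
```

The payoff of the technique is economy: six boundary values plus three partition representatives give essentially the same defect-finding power as testing every possible age.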

Our hybrid onsite and offsite teams utilize the cognitive benefits of autism to provide the sustained concentration and analytical mindset that make us wired for success.

How can we help?

Our Quality Assurance & Testing services

We cover all phases of the Software Testing Life Cycle.

Requirement Analysis

Acquisition of domain knowledge and identification of testable requirements through collaboration with the respective stakeholders. Creation of a test-traceability matrix to determine sufficient test coverage, select a suitable methodology (functional or non-functional), and assess the feasibility of test automation.
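As an illustration of how a test-traceability matrix supports the coverage determination described above, here is a minimal sketch; the requirement and test-case IDs are hypothetical:

```python
# Hypothetical requirement-to-test-case traceability matrix.
# Each requirement maps to the test cases that exercise it.
traceability = {
    "REQ-001 login":          ["TC-01", "TC-02"],
    "REQ-002 password reset": ["TC-03"],
    "REQ-003 audit log":      [],  # no test cases yet -> coverage gap
}

def coverage_gaps(matrix):
    """Return the requirements not yet covered by any test case."""
    return [req for req, cases in matrix.items() if not cases]

print(coverage_gaps(traceability))  # -> ['REQ-003 audit log']
```

In practice the matrix lives in a tool such as Jira or TestRail rather than code, but the coverage question it answers is the same: which requirements have no test case tracing back to them.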


Test Planning

Deciding on the test strategy for the various types of testing and the most efficient testing methodologies, including defect management, configuration management, risk management, etc., in close collaboration with product owners and business analysts. Estimating the required test effort, determining roles and responsibilities, and planning the use of resources. Evaluating and identifying appropriate testing and defect-tracking tools, including planning potential training in these tools.


Test Development and Test Environment Setup

Development of test suites with specific test cases and their respective test scripts based on identified test scenarios/user stories, in accordance with the test plan. Setup of the test environment and configuration management, including preparation and creation of test data and manual/automation test scripts. Creation, verification, adaptation and approval of test cases.


Test Execution

Execution of test cases and test scripts according to the test plan. Capturing, reviewing and analyzing the test results. Logging, tracking, retesting and closing defects as they are addressed by the development team.

More specifically, we also cover:

Test Management and Requirements Analysis

  • Test Strategy Consulting
    Software testing consulting services in the areas of test strategy, methodology, and process assessment across the testing life cycle. Our aim is to support the needs of the business with focus on IT alignment.
  • Test Management
    Testing management for the project including test suite audit, creation, and updates, defect lifecycle management, Agile/SCRUM management, and UAT best practices.

Testing Methodologies

Test Automation

Automation test execution along with automation development, from website and application programming interface automation to robust custom framework implementation.
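The automation execution described above typically runs through tools such as Selenium WebDriver; as a framework-agnostic sketch of the underlying idea (data-driven cases executed by a shared runner), consider the following, where the AutoCase class and the price-calculator stub are purely illustrative:

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class AutoCase:
    """One data-driven test case: a name, an action, and its expectation."""
    name: str
    action: Callable[[], Any]  # in real automation this would drive a browser or API client
    expected: Any

def run_suite(cases):
    """Execute each case, compare actual vs. expected, and collect pass/fail results."""
    results = {}
    for case in cases:
        try:
            results[case.name] = case.action() == case.expected
        except Exception:
            results[case.name] = False  # a crash counts as a failed case
    return results

# Hypothetical system under test: a price calculator stub (amounts in cents).
def total_price(qty: int, unit_cents: int) -> int:
    return qty * unit_cents

suite = [
    AutoCase("two items", lambda: total_price(2, 999), 1998),
    AutoCase("zero items", lambda: total_price(0, 999), 0),
]
print(run_suite(suite))  # -> {'two items': True, 'zero items': True}
```

The design choice to keep test data separate from the runner is what makes an automated suite cheap to extend: adding a scenario means adding a row of data, not new execution logic.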

Manual Testing

Usually black-box testing of the AUT (Application Under Test) to ensure requirements are met.

  • Unit Testing
    Testing of individual parts of the software such as modules, classes and functions.
  • Integration Testing
    After unit testing has been successful, we test how different modules work when integrated.
  • System Testing
    The system is tested on the actual hardware involving all interdependent hardware and software components.
  • Acceptance Testing
    When system testing has been successfully completed, we verify that product requirements have been met and that end users have tested the system to confirm that the software operates as expected.
  • Smoke Testing
    Preliminary testing to reveal simple failures severe enough to reject a candidate build before deeper testing begins.
  • Sanity Testing
    Brief run-through of major functionality to ensure that certain parts of the software roughly do what they are expected to do.
  • Regression Testing
    Retesting the system to ensure that changes made to the code do not affect existing system functions.
  • End-to-End Testing
    The entire application is tested in a real-world scenario, using intuitive, user-story-driven actions with production-like data to simulate real-time settings. The goal is to ensure that all components of the system, such as the database, network, hardware, UI and other applications, work together as intended.
  • Accessibility Testing
    Our team uses the latest Web Content Accessibility Guidelines to perform web site audits and identify issues or potential concerns for those with disabilities. Our team leverages widely used tools such as screen readers, magnifiers, and alternative input devices.
  • Performance Testing
    We measure how the system behaves under expected loads but also check where the breaking point of the system is using a hybrid automated and manual testing approach.
  • Usability Testing
    End-user-centered testing approach that ascertains that the software can be used intuitively and with ease. Cornerstones of this method are aspects such as learnability, efficiency, memorability and errors.
  • Compatibility Testing
    Tests to ensure that the product is compatible with targeted operating systems, hardware configurations, web browsers, mobile devices etc.
  • Localization and Internationalization Testing
    Working with our clients to deliver flawless user experiences in every country and language. We have several employees on staff who are fluent in multiple languages.
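Several of the methodologies above, in particular unit and regression testing, can be sketched with Python's built-in unittest module; the discount rule below and its $50 threshold are hypothetical examples, not client logic:

```python
import unittest

# Hypothetical unit under test: a discount rule (amounts in cents).
def apply_discount(total_cents: int, member: bool) -> int:
    """Members get 10% off orders of 5000 cents ($50) or more."""
    if member and total_cents >= 5000:
        return total_cents - total_cents // 10
    return total_cents

class DiscountTests(unittest.TestCase):
    # Unit tests: exercise the function in isolation.
    def test_member_over_threshold(self):
        self.assertEqual(apply_discount(6000, member=True), 5400)

    def test_non_member(self):
        self.assertEqual(apply_discount(6000, member=False), 6000)

    # Regression test: pins the inclusive boundary behavior so that a
    # future code change cannot silently reintroduce an off-by-one defect.
    def test_threshold_is_inclusive(self):
        self.assertEqual(apply_discount(5000, member=True), 4500)

# Run the suite without exiting the interpreter.
loaded = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
result = unittest.TextTestRunner(verbosity=0).run(loaded)
print("suite passed:", result.wasSuccessful())  # -> suite passed: True
```

Once such a suite exists, regression testing is simply rerunning it after every code change, which is also what makes these tests natural candidates for the automation services described earlier.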

Project Staffing Structure

QA Director – The QA Director has the overall responsibility of the initiation, planning, monitoring and closure of the project. This role interfaces directly with Client Management. Our QA Directors each have over 10 years’ experience in IT.

QA Automation Lead – The QA Automation Lead has the primary responsibility for understanding and documenting Client test processes, creating test cases and transferring this knowledge to the QA Analysts. They will be responsible for establishing and recommending a test framework. This role will interface directly with the Client Development and QA teams, and will serve as the liaison between the Client team and MindSpark QA Analysts. Our QA Leads have over 5 years of experience.

QA Lead – The QA Lead will be primarily responsible for understanding and documenting Client test processes, creating test cases and transferring this knowledge to the QA Analysts. They will ensure the quality of the test cases, test results and defect reports, and perform quality assurance on the QA Analysts throughout the knowledge-transfer process. This role will interface directly with the Client Development and QA teams, and will serve as the liaison between the Client team and MindSpark QA Analysts. Our QA Leads have over 5 years of experience.

QA Automation Engineer – The QA Automation Engineer will be primarily responsible for delivering automated test cases and the test framework, and for setting up the environments needed to complete the POC automated test cases. This requires a solid understanding of existing Client test cases and product functionality, as well as of the Client's automation needs and processes. QA Engineers will transfer knowledge to, and ensure the quality of test scripts from, the QA Automation Analysts. This role will interface directly with the Client Development and/or QA teams, and will serve as the liaison between the Client team and the MindSpark QA team.

QA Automation Analyst – The QA Automation Analyst's main focus will be creating automation test scripts based on written test cases. They will also participate in manual test case execution to get familiar with existing test cases, and may assist with collecting, troubleshooting, and reporting automation test run results. This role will be supervised by the QA Engineer. The QA Automation Analyst has 2-5 years of experience in QA.

QA Sr. Analyst – The Sr. QA Analyst will assist the QA Lead with certain Lead responsibilities such as reviewing test results and analyzing defects found. Sr. QA Analysts will also participate in test case execution. The Sr. QA Analyst has 2-5 years of experience in QA.

QA Analysts – The QA Analysts will assist the QA Lead(s) in test case creation and receive knowledge transfer. QA Analysts will be the primary testers during execution and will provide detailed defect reporting. Our analysts have 1-5 years of experience in QA.

Tools and Technologies

  • Windows
  • Linux
  • Mac OS
  • iOS
  • Android
  • tvOS
  • Java
  • JavaScript
  • Python
  • C
  • C#
  • Selenium WebDriver
  • Protractor
  • Git
  • Jenkins
  • Bitbucket
  • Bugzilla
  • JIRA
  • Confluence
  • Slack
  • Rally
  • HP ALM (Quality Center)
  • TestRail
  • QMetry
  • Office 365
  • Apache JMeter
  • LoadRunner
  • Wireshark
  • Postman
  • SoapUI

Success Stories


“The auticon tester absolutely delivered the added value we wanted. His extremely precise approach coupled with his very high quality standards in testing were worth their weight in gold.”

Dr. Thomas Seeger | Head of Software Development, Deichmann SE

The Monterey Bay Aquarium is developing a web-based application to support its world-renowned Seafood Watch program, and they needed a team to help perform software testing on pre-launch enhancements to the software. “The Seafood Watch program assesses the sustainability of fisheries and aquaculture operations by compiling relevant science-based information and evaluating that information against our standards.”

auticon had only a week to review the enhancements and complete their testing. They worked very quickly, creating and executing test case scenarios. auticon is now on the third iteration of testing, including various extensions and change order requests, to help ensure the best possible experience for their scientists.

auticon was able to work under the tight deadline and complete the required software testing, and continues to find defects that the programming team is able to fix. auticon is testing the program as expanded options become available for the Fisheries and Aquaculture sections of the Assessment Tool, and is tracking the defects in Jira.

iRise, the market leader in enterprise visualization software, needed a team to quickly test a website that tracks adoption of their software products.

auticon conducted a 6 week software testing project, learning the requirements of the website and executing a test plan from start to end.

auticon created over 190 test scenarios, covering 90% of the website functionality. They found and logged 30 defects into iRise’s Jira bug tracking system.

ValleyCrest, a $1 billion provider of landscape development and maintenance services, wanted to set up a Facebook-style newsfeed on their corporate intranet site, and they needed a team to test specific enhancements to the software.

auticon spent two weeks understanding the features of the news feed, then developed and executed over 30 test scenarios.

auticon identified several critical defects that the programming team was able to resolve before finishing the news feed. The project was completed on time and within budget.

The IRIS program is a federally funded, multi-year grant that creates material for educators teaching students who will eventually work with people with disabilities. The organization had a 10-year-old website that was rich with resource content, but it had usability challenges and needed a design update.

Square One initiated a project to redesign the IRIS website and integrate it with WordPress. auticon conducted all of the functional and visual testing as well as all data conversion of web copy from the old to the new site.

auticon has successfully designed and developed multi-layered test cases for the IRIS site and is currently testing the site. The group has also successfully converted over 3800 static web pages from the old site to the new one. The website is scheduled to be completed and launched in early August 2013.