All 2012 archive posts

Last updated: December 31, 2012

A quick guide to Agile testing in PractiTest

Date: 2012-02-01

More and more teams are switching to Agile development, in order to increase their efficiency and meet the competitive requirements of the field and their users.

Agile software development is very different from more traditional forms of development. As such, it is only logical that adopting an Agile development methodology will require a change in the testing process.

Managing your Agile testing process using PractiTest

Agile testing is a team effort, and therefore requires high-quality communication between team members. Since the testing process may seem less “organized”, it is very important that the relevant information is available to all parties involved. You should always know what should be tested, who is testing what, and where everything stands.

Here at PractiTest we are ardent believers in agile development and testing, and we try to design our test management software accordingly. Many of our features can contribute significantly to your agile testing process (as well as make your life a lot easier regardless of your testing methodology). Using Traceability between entities, dynamic views instead of rigid folders, the flexibility of our customization settings, and the graphical information displayed in the Dashboard, your testing process can be more effective than ever before.

User stories

In more traditional testing methods, you would use the Requirements module to define how your system under test (SUT) should work, and what should be tested. In an Agile testing process, you can replace the traditional requirements with user stories – short and precise descriptions of your end users’ needs. You can then organize your User Stories using Custom Views.

Sharing tests in the test library

In PractiTest you can write and manage your Acceptance Tests (i.e. tests designed to ensure that your requirements are met) within the Test Library, linking them back to the User Story where they originated.

You can then use the history, comments and notifications features to allow everyone to add their inputs into these tests, and to be informed about any changes made by other users.

You can simply create a test for each User Story, where developers can provide their inputs to testers as they come up with ideas during the design or coding process.

Creating test sets for each user story

We recommend creating Test Sets for each User Story independently. These test sets can contain the acceptance tests, functional tests, and any other testing operations needed for a specific User Story. This way you can get a better sense of coverage and completion for each User Story. It is also recommended to use the Traceability function to link tests to their user stories.
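
To make the idea of per-story coverage more concrete, here is a minimal, purely illustrative sketch in Python (the entity names and fields are hypothetical and do not reflect PractiTest's internal data model) showing how a test set's run statuses roll up into a completion figure for its user story:

    from dataclasses import dataclass, field

    @dataclass
    class TestInstance:
        test_name: str
        status: str = "NOT RUN"   # e.g. "PASSED", "FAILED", "NOT RUN"

    @dataclass
    class TestSet:
        user_story: str
        instances: list = field(default_factory=list)

        def completion(self) -> float:
            """Fraction of instances that have already been executed."""
            if not self.instances:
                return 0.0
            executed = sum(1 for i in self.instances if i.status != "NOT RUN")
            return executed / len(self.instances)

    # One test set per user story, as recommended above
    login_story = TestSet(
        user_story="As a user I can log in with my email",
        instances=[
            TestInstance("Acceptance: valid credentials", "PASSED"),
            TestInstance("Functional: wrong password message", "FAILED"),
            TestInstance("Functional: password reset link"),
        ],
    )

    print(f"{login_story.user_story}: {login_story.completion():.0%} executed")
    # -> As a user I can log in with my email: 67% executed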


Grouping issues based on their target sprint and user story

When you report issues, use custom fields to assign them the sprint in which they should be solved.

Also, in order to have traceability between issues, tests and requirements, you should link your tests to their relevant user stories. You can report issues directly from your test runs (using “fail and issue”), or link the issues back to the tests they originated from, for full traceability.
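
Purely as an illustration of this grouping idea (the field and record names below are hypothetical and only stand in for PractiTest's custom fields and traceability links), a sprint custom field plus a link back to the originating test is all that is needed to build a per-sprint view of your issues:

    from collections import defaultdict

    # Hypothetical issue records: a "sprint" custom field plus a link to the
    # test (and, through it, the user story) the issue originated from.
    issues = [
        {"id": 101, "title": "Login button misaligned", "sprint": "Sprint 4",
         "test": "Acceptance: valid credentials"},
        {"id": 102, "title": "Wrong error message", "sprint": "Sprint 4",
         "test": "Functional: wrong password message"},
        {"id": 103, "title": "Timeout on slow network", "sprint": "Sprint 5",
         "test": "Functional: password reset link"},
    ]

    # Group issues by the sprint in which they should be solved,
    # mirroring a custom-field based view.
    by_sprint = defaultdict(list)
    for issue in issues:
        by_sprint[issue["sprint"]].append(issue)

    for sprint, sprint_issues in sorted(by_sprint.items()):
        print(sprint)
        for issue in sprint_issues:
            print(f"  #{issue['id']} {issue['title']} (from test: {issue['test']})")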

Using views to organize your issues based on Sprints, User Stories, Modules, etc.

A good practice is to use the Issues Module not only to report bugs, but also to manage all the tasks of your User Stories and Sprints. Create tasks to keep track of the activities of your project and their individual statuses.

Provide visibility using a Summary Dashboard and additional Dashboards per User Story

You can use the Dashboard to keep your team up to date with the status of the Sprint in general, and of each User Story in particular.

With the help of the views you have in each of your modules, create one dashboard centralizing all the information for your Sprint, and then create additional dashboard tabs with information for each User Story independently.

PractiTest joins Red Hat Innovate!

Date: 2012-03-01

We are happy to announce that PractiTest has been selected to participate in Red Hat Innovate.

Red Hat Innovate is a new initiative launched by Red Hat, the world’s leading Open Source Software Company. The initiative aims to assist innovative software development start-ups to build on the power of the open source community.

Red Hat's vote of confidence means a lot to us, and we are very excited to be offered this partnership. This opportunity will help us keep doing what we do best – developing and innovating in the field of software testing.

Improved instances grid, multi-level linked lists and more in PractiTest’s latest release

Date: 2012-04-01

We open this month with the latest version of PractiTest, released on April 1st. This version includes new features and updates, most of which were requested by our customers.

Here are the main changes made in the latest version:

  • Our new and improved Test Instances grid: You can now choose to display any field taken from the original test (in the Test Library) as part of your test instances grid (within the Test Set). So you can now filter the instances in the grid based on the Test fields (just click on the Columns button)!
  • Multi-level linked-lists: in previous versions, you were able to create a linked-list field whose values depend upon a previously created list field, so your linked lists were limited to 2 levels only (parent-child). Now you no longer have to settle for a 2-level linked list, and you can create linked lists with as many levels as you need (see the conceptual sketch after this list).
    For example, you can create a regular list field to denote the OS you are working on, then a linked list of Browsers showing you only the browsers available for each OS, and then a second linked list of Browser Versions showing you only versions relevant for the selected browser.
  • New notification list: users defined as Administrators can now add other users to an entity's notification list, to make sure those users receive messages when changes are made to the entity.
  • Improved dashboard pie-charts, with easy to read labels located outside the different sections.
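
Multi-level linked lists can be pictured as a simple cascade: the values offered at each level depend on the selection made one level up. The configuration itself is done in the PractiTest GUI; the nested structure below is just a conceptual model of the OS / Browser / Browser Version example above (the values are illustrative only):

    # A conceptual model of a three-level linked list:
    # the values offered at each level depend on the selection one level up.
    linked_lists = {
        "Windows 7": {
            "Internet Explorer": ["8", "9"],
            "Firefox": ["10", "11"],
            "Chrome": ["17", "18"],
        },
        "Ubuntu 11.10": {
            "Firefox": ["10", "11"],
            "Chrome": ["17", "18"],
        },
    }

    def options(os_name=None, browser=None):
        """Return the values available at the next level, given the selections so far."""
        if os_name is None:
            return sorted(linked_lists)                      # level 1: OS
        if browser is None:
            return sorted(linked_lists[os_name])             # level 2: browsers for that OS
        return linked_lists[os_name][browser]                # level 3: versions for that browser

    print(options())                          # ['Ubuntu 11.10', 'Windows 7']
    print(options("Ubuntu 11.10"))            # ['Chrome', 'Firefox']
    print(options("Ubuntu 11.10", "Chrome"))  # ['17', '18']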

We hope you enjoy and benefit from these new additions. As always, we are waiting for your feature requests; if you have further suggestions or comments, don't hesitate to let us know!

Plimus

Date: 2012-06-01

“We were looking for a solution to organize our testing process and keep control of our quality. PractiTest supplied what we needed, with its organized and structured test case management solution.”

Gilad Breslawer, QA Team Leader, Plimus

Symcotech

Date: 2012-06-01

“PractiTest enables us to organize our tests and report bugs into one database, in a unified and pre-defined mode, providing us with much-needed order and enabling us to streamline our testing process.
As a service provider working in complete transparency with our customers, we found the PractiTest dashboard and reporting mechanism a superior tool for providing our customers with a clear and precise view of development status. In this way PractiTest helps us reach our goal of maximal customer satisfaction.”

Hila Vax, QA Team leader, Symcotech

Delek Corp.

Date: 2012-06-01

“With PractiTest we manage testing of three separate applications (ERP, point-of-sale, and payment processing). This means that separate QA teams in different sites use PractiTest to manage tests and bugs.
No doubt, this is the best product in its price category: PractiTest is solid enough to support multiple testing environments and complex scenarios, yet it’s very intuitive for everyday use (such as filling in bug information and opening new projects). Their support is second to none; a mail is normally replied within the hour; sometimes directly by phone.”

Ophir Amitai, QA Manager, Delek Corp.

Wavion

Date: 2012-06-01

“With PractiTest we finally connected Marketing, R&D and QA in one system, streamlining our whole product development process – from requirements to testing. The results: shorter development and testing cycles, and a product that is closer to market demands.”

Uriel Perlman, Head of QA, Wavion

How to Best Manage Distributed Testing

Date: 2012-07-01

We all see it happening, and many of us are part of it: the distributed testing (and development) phenomenon. A project may be tested by engineers in different locations, either locally (working from home or the office, on different shifts and in different departments) or internationally (anywhere in the world, across a range of time zones).

QA managers and testers know what a coordination and management nightmare distributed testing scenarios can be. If everyone isn't on the same page, testing teams not only miss deadlines but also fail to catch and properly follow up on all issues and bugs, and this leads to… well, we all have a story or two about where this leads.

Although distributed testing has been around for years, there are teams out there that still use Excel to manage their testing procedures. Now, don't get us wrong. We think Excel is great. We use it too, to keep track of our departments' budgets, for some of our scheduling, and to import/export testing-related files. These days, Excel can even be accessed via the Web and mobile, and it does enable a degree of collaboration. But, for managing testing, the most efficient – and dare we say, smartest – method is to use a dedicated testing solution, created by testers, for testers, and for the way they need to work.

As lots of you already know, PractiTest is a great testing solution. It allows managers to manage and gain control of their processes in the way that makes sense for the particular project, taking into account the departments, assignments, and people involved. There are many features that make it a thorough and dependable test management solution, but here we'll mention just a few – those most relevant to distributed testing.

  • PractiTest facilitates the communication that is crucial to successful distributed testing – by keeping everyone up to date with tasks, issues, bugs, statuses, etc. and making the work routine clear to all. And, because it’s cloud-based, no matter where in the world testers are doing their job, the system is fast and responsive.
  • QA managers simply set up and organize test runs, assigning them to testers – wherever they may be. Each step performed reports its results independently, including the actual outcome of the test, so the status of who is working on what, and where it stands, is clear to all.
  • Issue workflow is managed in a single system, including work on bugs, enhancement requests and other tasks. For example, managers can define the workflow for bugs, specifying which groups may perform the different transitions between statuses, so that rejecting issues can be limited to team leaders, only testers may transition a bug from Fixed to Closed, etc.
  • Duplicate bugs are prevented from being submitted – a potential scenario when many testers are not necessarily in direct contact. The system scans the database for descriptions similar to the bug currently being entered into the system (see the sketch after this list).
  • Managers can set up email notifications for alerts on work on issues of particular importance.
  • And more…
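
The exact matching logic is internal to PractiTest, but purely to illustrate the general idea, here is a minimal sketch of one way a new bug description could be compared against existing ones (using Python's difflib; the similarity threshold is arbitrary and hypothetical):

    from difflib import SequenceMatcher

    existing_issues = [
        "Login page crashes when the password field is empty",
        "Report export to PDF produces an empty file",
        "Dashboard pie chart labels overlap on small screens",
    ]

    def similar_issues(new_description, issues, threshold=0.6):
        """Return existing descriptions whose similarity ratio exceeds the threshold."""
        return [
            text for text in issues
            if SequenceMatcher(None, new_description.lower(), text.lower()).ratio() >= threshold
        ]

    print(similar_issues("Login page crashes when password field is left empty", existing_issues))
    # -> ['Login page crashes when the password field is empty']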

We encourage you to find out more about PractiTest so that you too can make distributed testing manageable and successful.

Introducing our new PractiTest logo

Date: 2012-07-01

As you may have noticed, earlier today we modified our Website.

Basically, we wanted our users and potential customers to understand why our Intelligent testing platform improves the QA planning, design and execution processes, ensuring product status transparency and reducing time-to-market uncertainty.

As part of this update we also introduced our new logo, which includes a drawing of our mascot, the testing fox.

Why did we add a fox to our logo?

For many of us (testers) this is a trivial question, but to make sure we don't leave any doubts out there, the answer lies in the similarities between the “personal characteristics” of a fox and those of a good tester.

Think about a fox for a second: how he behaves in nature, and what traits help him survive and even excel in his surroundings.

You can say that in order to survive in the wild, a fox needs to be:
– Smart
– Crafty
– Creative
– Sneaky
– Adaptable
– Ingenious
And the list goes on and on…

Now let’s look at the Tester in his natural habitat:

SMART – OK, who'd ever say that his job allows even “dumb” people to succeed… But in the case of testing we know that, due to the level of complexity required of us, as well as the multitasking and context-switching that define our day-to-day tasks, testers need to be smart enough to quickly grasp the challenges ahead of them and successfully approach them in the most intelligent way – without wasting time over-analyzing the situation.

CRAFTY-ness in testers refers to the way we are required to make use of the tools we have at hand, and at times even to come up with new tools to complete our jobs.

CREATIVE-ness allows us to look at an issue and come up with interesting new approaches to understand and solve it. Sometimes it is a question of how to test a component; other times it can be how to find the complete scenario of a difficult bug. Regardless of how you see it, a tester needs to be able to find solutions by reframing or zooming out from their tasks.

SNEAKY – as the saying goes, it takes a thief to catch a thief! And so, a tester needs to have a sense of sneakiness when he sets out to hunt for bugs. One of the tips I give beginning testers is to go and check the bug database for defects that were detected and fixed in past versions. Understanding the way bugs sneak into the system is always a good way of catching new bugs that used “old ways” to sneak in.

ADAPTABILITY is also something that allows a good tester to juggle between the testing tasks, bug verification tasks, “trouble-shooting with support” tasks, feedback to developer tasks, and all the other tasks that fill out our daily task list.

Finally, INGENUITY or cleverness is what allows us to keep coming up with answers and different approaches to the challenges we face every day.

So, why did we choose a fox? Because a tester needs to be like the fox, who works in small groups and, even though he is not the strongest or the fastest in the forest, is still able to catch his prey successfully and elegantly.

Improved Reports, advanced options on Dashboard items and more

Date: 2012-08-01

Last night we released the latest PractiTest update, which includes a number of features aimed at improving the reporting and Testing Intelligence capabilities of the system.

  • Dashboard graphs and tables in reports. This was a request that came from a number of users who wanted to add graphical elements to their reports. The feature allows users to add any graph (pie-chart, bar-chart, or progress graph) or any distribution table from the dashboard to any report in PractiTest.
  • Select which parts to include in your Detailed reports. Another field request, this one asking for control over the sections included as part of your detailed reports. Starting today you can select whether or not to show sections such as comments, history, etc. as part of your reports.
  • Control the colors of your dashboard graphs. A simple yet powerful feature that lets you choose what colors to display for each category in your graphs.
  • Choose where to display graphs’ legends. An additional field request to define where to place the legend describing the graphs. You can now choose if you want to display this legend inside the graph, at the bottom of the graph, or not to display the legend at all.
  • Graph entity preview. This one is the team’s favorite 🙂 – a simple preview of the dashboard items within the settings, to show you how the graph will look based on the definitions you just set or modified.

We are clearly interested in getting your feedback on these or any other features of the system. Post your requests on our user forum, and send your comments to our support team (support-at-practitest-dot-com).

See you all soon with another PractiTest update!

* Also, there was no downtime for this upgrade.

SAS, Red Hat and PractiTest Collaborate Around Next Generation Testing and QA Management

Date: 2012-10-01

We are pretty excited about the announcement released a couple of days ago about the successful collaboration between SAS, Red Hat and PractiTest, as you can read in the official press release.

The announcement highlights the great success SAS has been having managing the testing and Quality Assurance aspects of their deployment projects in the UK.

As was described by James Ochiai-Brown, SAS Senior Solution Architect: “PractiTest gives us the ability to manage our testing in a structured way, and more importantly, demonstrate to our clients the quality assurance that they demand…”

The press release also emphasizes the value gained from the relationship between PractiTest and Red Hat, as part of the Red Hat Innovate program, to which PractiTest was accepted late last year.

If you have questions about any part of this Press Release feel free to contact us.

Server issues and backup restore

Date: 2012-10-23

Yesterday (Oct 22nd), at about 5:45 PM GMT (10:45 AM PDT), our web hosting provider Amazon (AWS) started experiencing performance and connection issues.
These issues caused a large number of Internet service companies, including PractiTest, to become unavailable.

We were able to regain temporary access to our servers between 11:30 PM GMT and 12:15 AM GMT, but then the system came down once again.

After evaluating all our options we decided that the best way to solve this issue would be to bring up our latest backup of the system and restore it to a completely new server farm in a different Amazon hosting zone.
We performed this operation and were able to restore full access to all our servers and services on Oct 23 at 8:45 AM GMT (1:45 AM PDT).

Still, the last full backup available to perform this operation was from yesterday at 12:00 noon GMT. This means that all information entered between 12:00 noon GMT and 6:45 PM GMT, as well as what was entered during the 45 minutes the servers were up later in the day, was not restored.

As a serious web service provider we understand that access to project data is your highest priority and our top-most responsibility.
We are still working to try to bring back the information that we were not able to restore, and we will work with the affected accounts to make it available to them as soon as possible on a separate server.

In parallel we are working on understanding what measures we can implement to lower the risk of issues like this happening once again in the future.

We apologize for any troubles this issue may have caused and will be happy to answer any questions you may have.

More about this Amazon AWS outage.

Improved Filters, Enhanced Attachments and more

Date: 2012-11-01

Last night we released a new version of PractiTest. This update comes with a number of features and improvements.

  • Improved filters. New GUI that makes it easier to manage your filters, as well as the ability to see the filter criteria for the whole filter hierarchy. In addition to this, the filter pane now loads a lot faster than before.
  • Enhanced attachments. Graphical attachments now have a thumbnail that allows you to preview the image before opening it.
  • Additional functionality for issue integrations: bug IDs are now displayed in reports.
  • And many other enhancements and features.

As always, we want to hear more from you about the things you'd like to see included or improved in PractiTest. Feel free to contact us via our support team.

See you all soon in another PractiTest update!

Shift your testing Forward