Yesterday (Oct 22nd) about 5:45 PM GMT (10:45 AM PDT) our web hosting provider Amazon (AWS) started experiencing performance and connection issues.
These issues caused a large number of Internet service companies, including PractiTest, to become unavailable.
We were able to regain temporary access to our servers between 11:30 PM GMT and 12:15 AM GMT, but then the system came down once again.
After evaluating all our options we decided that the best way to solve this issue would be to bring up our latest backup of the system and restore it to a completely new server farm in a different Amazon hosting zone.
We performed this operation and were able to restore full access to all our servers and services by Oct 23, 8:45 AM GMT (1:45 AM PDT).
Still, the most recent full backup available for this operation was from yesterday at 12:00 noon GMT. This means that information entered between 12:00 noon GMT and 5:45 PM GMT (when the outage began), as well as what was entered during the 45 minutes the servers were up later in the day, was not restored.
As a serious web service provider we understand that access to project data is your highest priority and our top-most responsibility.
We are still working to try to bring back the information that we were not able to restore, and we will work with the accounts affected to make it available to them as soon as possible in a separate server.
In parallel we are working on understanding what measures we can implement to lower the risk of issues like this happening once again in the future.
We apologize for any troubles this issue may have caused and will be happy to answer any questions you may have.
We are excited about the announcement released a couple of days ago regarding the successful collaboration between SAS, Redhat and PractiTest, as you can read in the official press release.
The announcement highlights the success SAS has had managing the testing and quality assurance aspects of its deployment projects in the UK.
As was described by James Ochiai-Brown, SAS Senior Solution Architect: “PractiTest gives us the ability to manage our testing in a structured way, and more importantly, demonstrate to our clients the quality assurance that they demand…”
The press release also emphasizes the value gained from the relationship between PractiTest and Redhat, as part of the Redhat Innovate program, to which PractiTest was accepted late last year.
If you have questions about any part of this Press Release feel free to contact us.
Last night we released the latest PractiTest update, which included a number of features aimed at improving the reporting and Testing Intelligence capabilities of the system.
- Dashboard graphs and tables in reports. This was a request that came from a number of users who wanted to add graphical elements to their reports. The feature allows users to add any graph (pie-chart, bar-chart, or progress graph) or any distribution table from the dashboard to any report in PractiTest.
- Select what parts to include in your detailed reports. Another request from the field, asking for control over the sections included in your detailed reports. Starting today you can choose whether or not to show sections such as comments, history, etc. as part of your reports.
- Control the colors of your dashboard graphs. A simple yet powerful feature that lets you choose which colors to display for each category in your graphs.
- Choose where to display graphs’ legends. An additional request from the field, to define where to place the legend describing each graph. You can now choose whether to display the legend inside the graph, at the bottom of the graph, or not at all.
- Graph entity preview. This one is the team’s favorite 🙂 – a simple preview of the dashboard items within the settings, to show you how the graph will look based on the definitions you just set or modified.
Clearly we are interested in getting your feedback on these or any other features of the system. Send us your requests to our user forum, and your comments to our support (support-at-practitest-dot-com).
See you all soon with another PractiTest update!
* Also, this upgrade required no downtime.
We all see it happening, and many of us are part of it: the distributed testing (and development) phenomenon. A project may be tested by engineers in different locations, either locally (working from home or the office, on different shifts and in different departments) or internationally, anywhere in the world and across a range of time zones.
QA managers and testers know what a coordination and management nightmare distributed testing scenarios can be. If everyone isn’t on the same page, not only do testing teams miss deadlines, they also fail to catch and properly follow up on all issues and bugs, and this leads to… well, we all have a story or two about where this leads.
Although distributed testing has been around for years, there are teams out there that still use Excel to manage their testing procedures. Now, don’t get us wrong. We think Excel is great. We use it too, to keep track of our departments’ budgets, for some of our scheduling and to import/export testing-related files. These days, Excel can even be accessed via the Web and mobile and it does enable a degree of collaboration. But, for managing testing, the most efficient – and dare we say, smartest – method is to use a dedicated testing solution, created by testers for testers and the way they need to work.
As lots of you already know, PractiTest is a great testing solution. It allows managers to manage and gain control of their processes in the way that makes sense for the particular project, taking into account the departments, assignments, and people involved. There are so many features that make it a thorough and dependable testing management solution but here we’ll mention just a few – those more relevant for distributed testing.
- PractiTest facilitates the communication that is crucial to successful distributed testing – by keeping everyone up to date with tasks, issues, bugs, statuses, etc. and making the work routine clear to all. And, because it’s cloud-based, no matter where in the world testers are doing their job, the system is fast and responsive.
- QA managers simply set up and organize test runs, assigning them to testers, wherever they may be. Each test step reports its own results, including the actual outcome, so it is clear to everyone who is working on what and where it stands.
- Issue workflow is managed in a single system, covering bugs, enhancement requests and other tasks. For example, managers can define the workflow for bugs, specifying which groups may perform the different transitions between statuses, so that rejecting issues can be limited to team leaders, only testers are permitted to transition a bug from Fixed to Closed, etc.
- Duplicate bugs are prevented from being submitted – a likely scenario when many testers are not necessarily in direct contact. The system scans the database for descriptions similar to the one of the bug currently being entered.
- Managers can set up email notifications for alerts on work on issues of particular importance.
- And more…
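To make the workflow idea above concrete, here is a minimal sketch of role-based transition rules. The statuses, roles, and table below are invented for illustration; PractiTest's actual workflow is configured in its UI, not in code like this.

```python
# Illustrative only: which roles may move a bug between which statuses.
# Status names and roles here are hypothetical examples, not PractiTest's.
ALLOWED_TRANSITIONS = {
    ("Open", "Fixed"): {"developer"},
    ("Fixed", "Closed"): {"tester"},        # only testers may close a fixed bug
    ("Fixed", "Reopened"): {"tester"},
    ("Open", "Rejected"): {"team_leader"},  # rejecting is limited to team leaders
}

def can_transition(role, current, target):
    """Return True if the given role may move a bug from `current` to `target`."""
    return role in ALLOWED_TRANSITIONS.get((current, target), set())

print(can_transition("tester", "Fixed", "Closed"))     # True
print(can_transition("developer", "Open", "Rejected")) # False
```

Keeping the rules in one table like this makes the policy easy to audit: every permitted transition, and who may perform it, is visible at a glance.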
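The duplicate-detection point above boils down to a similarity scan over existing bug summaries. The sketch below uses Python's standard-library `difflib` to illustrate the idea; it is not PractiTest's implementation, and real systems typically use indexed text search rather than a linear scan.

```python
from difflib import SequenceMatcher

def similar_existing_bugs(new_summary, existing_summaries, threshold=0.6):
    """Return (summary, similarity) pairs that closely resemble the new bug.

    A naive linear scan using difflib's ratio(); good enough to show
    how a duplicate warning can be raised before a bug is submitted.
    """
    matches = []
    for summary in existing_summaries:
        ratio = SequenceMatcher(None, new_summary.lower(), summary.lower()).ratio()
        if ratio >= threshold:
            matches.append((summary, ratio))
    # Most similar candidates first
    return sorted(matches, key=lambda m: m[1], reverse=True)

existing = [
    "Login button unresponsive on Safari",
    "Crash when exporting report to PDF",
]
print(similar_existing_bugs("Login button not working on Safari", existing))
```

In practice a tool would show these candidates to the tester before the new bug is saved, letting them confirm it is genuinely new.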
We encourage you to find out more about PractiTest so that you too can make distributed testing manageable and successful.
As you may have noticed, earlier today we modified our Website.
Basically, we wanted our users and potential customers to understand why our Intelligent testing platform improves the QA planning, design and execution processes, ensuring product status transparency and reducing time-to-market uncertainty.
As part of this update we also introduced our new logo, which includes a drawing of our mascot, the testing fox.
Why did we add a fox to our logo?
For many of us (testers) this is a trivial question, but still, to make sure we don’t leave any doubts out there: the answer lies in the similarities between the “personal characteristics” of a fox and those of a good tester.
Think about a fox for a second: how he behaves in nature and what traits help him survive and even excel in his surroundings.
You can say that in order to survive in the wild a fox needs to be:
- Smart
- Crafty
- Creative
- Sneaky
- Adaptable
- Ingenious
And the list goes on and on…
Now let’s look at the Tester in his natural habitat:
SMART – OK, who would ever say that their job allows even “dumb” people to succeed… But in the case of testing we know that, due to the level of complexity required of us, as well as the multitasking and context-switching that define our day-to-day tasks, testers need to be smart enough to quickly grasp the challenges ahead of them and approach them in the most intelligent way, without wasting time over-analyzing the situation.
CRAFTY-ness in testers refers to the way we are required to make use of the tools we have at hand, and at times even to come up with new tools to complete our jobs.
CREATIVE-ness allows us to look at an issue and come up with new and interesting approaches to understanding and solving it. Sometimes it is how to test a component; other times it can be how to find the complete scenario of a difficult bug. Either way, a tester needs to be able to find solutions by reframing or zooming out from their tasks.
SNEAKY – as the saying goes, it takes a thief to catch a thief! And so, a tester needs to have a sense of sneakiness when he sets out to hunt for bugs. One of the tips I give beginning testers is to go and check the bug database for defects that were detected and fixed in past versions. Understanding the way bugs sneak into the system is always a good way of catching new bugs that used “old ways” to sneak in.
ADAPTABILITY is also what allows a good tester to juggle testing tasks, bug verification tasks, “troubleshooting with support” tasks, feedback-to-developer tasks, and all the other tasks that fill our daily task list.
Finally, INGENUITY, or cleverness, is what allows us to keep coming up with answers and different approaches to the challenges we face every day.
So, why did we choose a fox? Because a tester needs to be like the fox: he works in small groups, and even though he’s not the strongest or the fastest in the forest, he is still able to catch his prey successfully and elegantly.
“We were looking for a solution to organize our testing process and keep control of our quality. PractiTest supplied what we needed, with its organized and structured test case management solution.”
Gilad Breslawer, QA Team Leader, Plimus
“PractiTest enables us to organize our tests and report bugs into one database, in a unified and pre-defined mode, providing us with much-needed order and enabling us to streamline our testing process.
As a service provider working in complete transparency with our customers, we found the PractiTest dashboard and reporting mechanism a superior tool for providing our customers with a clear and precise view of development status. In this way PractiTest helps us reach our goal of maximal customer satisfaction.”
Hila Vax, QA Team leader, Symcotech
“With PractiTest we manage testing of three separate applications (ERP, point-of-sale, and payment processing). This means that separate QA teams in different sites use PractiTest to manage tests and bugs.
No doubt, this is the best product in its price category: PractiTest is solid enough to support multiple testing environments and complex scenarios, yet it’s very intuitive for everyday use (such as filling in bug information and opening new projects). Their support is second to none; a mail is normally replied within the hour; sometimes directly by phone.”
Ophir Amitai, QA Manager, Delek Corp.
“With PractiTest we finally connected Marketing, R&D and QA in one system, streamlining our whole product-development process – from requirements to testing. The results: shorter development and testing cycles, and a product that is closer to market demands.”
Uriel Perlman, Head of QA, Wavion