TestCaddy is a software testing tool that allows you to create Test Sets for many different clients, projects or software programs. Test Sets are readily edited or updated, easily accessed for regression testing and TestCaddy maintains a full history of testing cycles and results. TestCaddy allows easy monitoring of progress through powerful reporting tools.
Many organisations continue to use spreadsheets for software testing. These rapidly become confusing and sometimes unreadable, particularly when more than one person needs to access the data. TestCaddy is a tool that will save time and money, whether your testing is done by developers or by a dedicated testing team.
TestCaddy is essentially a database of Test Sets written for each customer and product to be tested, which can then be used for regression testing. A full history of testing is retained, and at any time you can review and report on progress with your testing project. Tests can be updated or new Tests added at any time. TestCaddy is used in conjunction with a bug-tracker such as Bugzilla or Jira. The defect associated with a failed Test can be logged in your bug-tracking tool of choice. TestCaddy has close integration with Jira.
System requirements are minimal. Any computer running Windows is generally sufficient. For full details refer to Installation Details.
Follow the step-by-step instructions on our Download page.
The TestCaddy installer will install any missing pre-requisites. Alternatively they can be installed manually (links available on the Download page) prior to running the TestCaddy installer. Once the TestCaddy installer has finished, double-click on the TestCaddy icon on your desktop.
The Configuration Wizard will guide you through the installation process. You have the option of installing SQL server or configuring the current computer to connect to a specific SQL server. Select the option to install SQL Express if you want TestCaddy to stand alone from the rest of the IT infrastructure in your company. An IT Administrator can always move your TestCaddy database to a company SQL server at a later time. (For further information on which option to select for your situation see FAQs (Frequently Asked Questions) on our webpage.)
If connecting to an existing SQL server, you will need to know the computer name (or IP address) of the SQL server, and the username and password required to log in (unless the server is set up for Windows Authentication).
If this is the first time TestCaddy has been run against this SQL server, the Wizard will create the first product database, and add demonstration data to this database for you. At the end of the Wizard you will be asked to create yourself a new login or select an existing user. The main TestCaddy window then opens.
Note: For more detailed instructions on download and installation refer to Installation Details.
The first time you run TestCaddy a Configuration Wizard will assist you to connect to the SQL Server you wish to use, or to install SQL Express on your workstation. The Wizard will also help you to create and then select a database into which you can write Tests - your first product database. At the end of the Wizard you will be asked to select an existing user, or to create a new login for yourself. The main TestCaddy window will then open.
TestCaddy comes with the option of having some demonstration data loaded when it is first opened. The folders can be renamed, copied, moved, deleted, or new folders can be added. Similarly Test Sets, Tests and History Items can be edited, copied, moved, renamed or deleted.

We recommend that when you have explored TestCaddy using the Demo Data you create a new product database to begin work on your real product (File > Database Manager).
Not all testers use the same nomenclature and there may be some new concepts to understand when using TestCaddy for the first time. As a guide, some of the more commonly used terms are introduced here.
For convenience, individual Tests are grouped into Test Sets. (Any number of Tests can be included in a Test Set.) These may also be referred to as a test suite or as test scripts.
Includes all the Tests in the product database (i.e. all the Tests written for a product).
Work on a product is usually divided into Projects.
Each Project may be subdivided into Folders. Test Sets are written for each Folder as work proceeds. Some teams may use a Test Set for each "sprint" in agile software development. The Tests used during one phase can easily be copied or edited for use in a later phase.
In TestCaddy all Tests are conveniently grouped according to the Functional Areas of the product (e.g. installation, web page, security features, etc.). This assists with locating Tests, and may also be useful for assigning work to members of the testing team.
Each time a Test is run, TestCaddy stores a record of the date and time of the run, the tester, and the status (Passed, Failed, or other options).
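Conceptually, each of these run records (History Items) is a small structure like the following sketch. The class and field names here are illustrative only, not TestCaddy's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative shape of a History Item as described above;
# the names are examples, not TestCaddy internals.
@dataclass
class HistoryItem:
    run_at: datetime      # date and time of the run
    tester: str           # who ran the Test
    status: str           # "Passed", "Failed", or another outcome

item = HistoryItem(run_at=datetime(2013, 5, 1, 14, 30),
                   tester="Bob", status="Passed")
```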
TestCaddy has, by default, one main display window on which there are three panes.

The upper left pane is called the Projects Panel. This opens with a treeview of Project Folders, and sub-folders (phase folders), which can be opened to show the Test Sets you have created.

The lower left pane is called the Active Set Panel. This opens to display the detail of the Test Set selected in the Projects Panel above. Again a treeview is used and all the Tests in the set are grouped into Folders (Functional Area Folders) reflecting the functional areas of the product. Below each Test are sub-items which record the history of each run of the Test.

The third pane on the right side of the TestCaddy window is called the Work Area. This displays information on a particular Test selected in the Active Set Panel.

The Test steps grid in the Work Area displays all the steps within a Test.
When a step is selected the details appear in the fields below:
Note: Most fields in the Work Area are only enabled when either editing or running a Test.
Tip: Some actions are provided as buttons across the top of the page, but most functionality is reached by right-click menu items or through keyboard shortcuts.
TestCaddy runs in a number of modes:
This is the normal mode. Click on the Project Test Sets button to return to this mode. In this mode you can view the Tests in Test Sets and their Folders. Limited editing is allowed in this mode. For example, you can create a new Test, Test Set, or Project Folder.
This is the only mode where major organisational changes can be made, e.g. creating and moving Functional Areas, and reordering Tests. Click on the Master Test Library button in the upper left toolbar and all Tests in the product database are displayed in the Test Set area.
This is the mode used for editing or selecting which Tests are in a particular Test Set. Select a Test Set in the Projects Panel and then click on the Select Tests button found at the top of the Active Set Panel. All Tests in the master Test Set are displayed, and the Test steps are shown when a Test is selected for easy recognition of the correct Test.
Note: Some functions of the Master Test Library mode (e.g. create new Test) are also available.
This mode is used for writing a Test, or adding or editing steps to a Test. To enter this mode, select a Test and then click on Edit Test. To exit this mode, use the Save or Cancel buttons to return to Browse mode.
This mode is used to run Tests and to record their status. TestCaddy enters this mode when you click on Run Test (or Continue Test and similar). The Expected Result Field is shown and you can set the status of individual Test steps or the entire Test. The Save and Cancel buttons let you exit this mode.
This mode also has its own saveable layout, so that when running Tests the TestCaddy window moves out of the way of the application being tested.
Note: Test steps can be edited in this mode.
To help you better understand TestCaddy modes, here are some quick how-to answers:
How do I ...
Click on the Project Test Sets button in the upper left toolbar, then expand the Folder(s) related to your Project. Click on a Test Set and the Tests in that set will be shown in the Active (Test) Set Panel (the lower treeview). In the treeview you can also see which Tests are assigned to which tester. You can use the filter button to limit the visible Tests to those assigned to yourself.
Click on the Master Test Library button in the upper left toolbar to enter Master Test Library mode. Then in the lower panel, right click on items in the tree to see available actions.
Click on the Project Test Sets button, select the Test Set to which you wish to add, then click on the Select Tests button in the toolbar that splits the Projects Panel and the Active (Test) Set Panel.
Click on Project Test Sets, select Test Set (add the Test if it is not already in the set), then click on Run Test in the upper right toolbar.
The Master Test Library can be viewed by clicking on Master Test Library in the Projects Panel toolbar. Whenever a Test is created it is automatically added to the Master Test Library (for the current product database). If the Master Test Library button is clicked, all the Tests in the current database are shown in the lower treeview, grouped according to Functional Area.

This is the place to organise your Tests, i.e. to make new Functional Areas, move Tests between Functional Areas, create new Tests and order Tests.
Note: These changes affect all Test Sets and all team members, so they are deliberately a little harder to find in the product and TestCaddy will remind you of their significance.
Each Test created has a numeric test label and also a unique identifier (UID): both are displayed under the Test Properties tab in the Work Area when the Test is selected.
Tip: The test label should be used when discussing a particular Test.
The first part of the numeric test label refers to the Functional Area Folder where the Test resides, so if a Test is moved within a Folder it retains its label. However, if it is moved into another Functional Area Folder it will receive a different numeric label. (A warning box alerts the user that this will occur, since the changes affect all users.) If a Test is copied, the copy will automatically receive a new UID and a new numeric label.
Note: Numbering of Tests may not appear to be consecutive, as Tests may have been moved. (The order in which Tests appear can be changed in the Master Test Library.) The numeric labels assigned to Tests which have been deleted are not used again.
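The labelling rules above (the Functional Area prefix, relabelling when a Test moves to another area, and retired numbers never being reused) can be sketched as follows. This is a hypothetical illustration, not TestCaddy's internal logic:

```python
# Hypothetical sketch of TestCaddy-style numeric test labels.
# A label is "<area>-<n>"; within an area the counter only grows,
# so deleted labels are never handed out again, and moving a Test
# to a different Functional Area assigns it a fresh label there.

class LabelAllocator:
    def __init__(self):
        self.next_n = {}          # per-Functional-Area counter

    def new_label(self, area):
        n = self.next_n.get(area, 0) + 1
        self.next_n[area] = n     # counter never decreases
        return f"{area}-{n}"

alloc = LabelAllocator()
t1 = alloc.new_label("Install")      # "Install-1"
t2 = alloc.new_label("Install")      # "Install-2"
# "Deleting" t1 does not free its number; the next Test is still:
t3 = alloc.new_label("Install")      # "Install-3"
# A Test moved into another Functional Area gets a new label there:
moved = alloc.new_label("Security")  # "Security-1"
```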
When a Test has been run, the history shows as a sub-item in the Active Set Panel. An overall status is indicated for the run, and the status of the most recent run is displayed in icon form for the Test itself.
If the conditions for neither Passed nor Failed are met:
A free text field. This should describe the actions the tester should carry out for that particular Test step.
Tip: All Tests are saved into the Master Test Library. If you create a Test in a Project Test Set (as in the example above) then TestCaddy automatically creates the Test in the Master Test Library and then adds the Test into the Project Test Set you are viewing. (View the Master Test Library by clicking on the Master Test Library button in the upper left toolbar).
To alter steps in a Test:
Tip: Depending on the way your team works, grouping Tests into Test Sets may be done for a number of reasons. With other tools, teams often use a Test Set to break a Project into groups of Tests that test the same functional area of a product. In TestCaddy all Test Sets are already grouped by Functional Area to achieve this goal. This allows Test Sets to be used for other common reasons such as:
Sometimes Test Sets are simply used to allow easy assignment of work to team members.
The set will be created, ready to be edited. TestCaddy automatically prefixes the name with a unique identifier for the Folder, giving a handy way to refer to a Folder when talking to colleagues.
Tip: You can click a Functional Area Folder to select or de-select all Tests in that Folder and its sub-folders.
You can remove Tests from a Test Set but caution is advised, as it may be better to mark a Test as Not Applicable.

Note: The result is shown as the most recent History Item under the selected Test in the Active Set Panel. The current overall status of the Test is also determined from your most recent run.
Note: Because Test steps can be edited while running a Test, at the end of the run TestCaddy will ask you if you want to save the changes only for that run of the Test (History Item), or to change the Test in the Master Test Library so that all future runs of that Test from any Test Set will use the changes you have made.
When a Test step has been run the tester selects an outcome from the Drop Down list in the status box displayed at the bottom of the Work Area. TestCaddy initially marks all steps as Not Run, and a Test with all steps in this state as not yet run. Choices are then:
Note: Shortcuts speed up this process and several steps may be selected at once.
During the running of a Test, the product version and build number are displayed in the Work Area. The product version number defaults to the most recently used version, but also has a Drop Down list of all current versions.
As new product versions become available for testing they should be entered under Configuration > Versions List. A version number and a brief description may be entered, and versions can be made inactive once they are outdated. (Note: When a History Item is being edited, inactive versions are still available.) Similarly, the build number defaults to that most recently used, but a Drop Down list offers all available builds, or the option of entering a new build number for the currently selected version.
The Environment and General Custom Fields can be recorded during a Test run. (Similarly they default to the most recent selections by the tester.)
When a run is saved, if the tester has not changed each of the values displayed for Environment and General Custom Fields, version and build, a confirmation dialog box asks you to confirm the default options are correct. If you do not wish to record these for every run of a Test, carry out the following:
TestCaddy will no longer require environment, version and build before saving a run.
If only some steps of the Test are completed, these can be saved, and the remainder left as Not Run. The Test history will show that the Test is still In Progress (or Failed if one or more steps were failed). The Continue Run item on the toolbar at the top of the Work Area displays the Test at the point it was saved, and the run may be continued.
Sometimes it is necessary to correct the information recorded when a Test was run. Simply select the History Item and click on Edit History in the upper toolbar of the Work Area. The Actual Result Field may be edited, and it is also possible to change the environment, version or build information.
TestCaddy's Custom Fields are split into two categories: Environment and General Custom Fields. Environment Custom Fields let you set, for example, the Operating System or SQL Server Version, and are color coded green throughout TestCaddy for ease of use. All other Custom Fields should be placed in the General Custom Field category, which is color coded blue. An example of a General Custom Field might be Reviewed Status.
Both categories of Custom Fields can be managed through the Custom Fields Manager dialog, where you can create new Custom Fields and enter and edit lists of values available for each field (see Section 3.Y).
Custom Fields can be used with:
TestCaddy allows the user to set their own Environment (green) and General (blue) Custom Fields to cater fully for the testing process. Using Configuration > Custom Fields allows the creation of your own settings for Environment to clearly inform the tester which Operating System etc. the Test requires. It also allows you to set Custom Fields to specify the exact way you want your Tests to be run.
To create these fields you will first need to create a "Drop Down", which is then accessed by right clicking on a root level Folder, Environment or General Custom Field. The Drop Down is the parent for the options you will have. Right clicking and selecting New Drop Down Items on it will allow you to create the next level of Drop Down Items. For example, if you wish to have your testers set the Operating System when they are running Tests then you would set an environment Drop Down to "OS" and your Drop Down Items underneath it to "Windows 7", "Windows Vista SP2" etc.
Note: TestCaddy also allows you to create Folders to organise your Custom Fields in the way you see fit. The levels of Folders are removed in the Custom Field selection dialogs that the users see. This is to allow for quick entry.
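The structure described above can be pictured as a small tree: Folders for organisation, Drop Downs as parents, Drop Down Items as the selectable values, with the Folder level flattened away in the selection dialogs. The field and value names in this sketch are examples only:

```python
# Hypothetical representation of the Custom Fields tree described
# above.  A Folder ("Machines") organises fields; a Drop Down ("OS")
# is the parent of its selectable Drop Down Items ("Windows 7", ...).
environment_fields = {
    "Machines": {                                    # Folder
        "OS": ["Windows 7", "Windows Vista SP2"],    # Drop Down -> Items
        "SQL Server Version": ["2008 R2", "2012"],
    },
}

def flatten(tree):
    """Mimic the selection dialog: drop Folder levels, keep Drop Downs."""
    flat = {}
    for folder_contents in tree.values():
        flat.update(folder_contents)
    return flat

selection_view = flatten(environment_fields)
# selection_view now maps each Drop Down directly to its Items.
```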
You can edit your Custom Fields at any time by accessing the Manage Custom Fields menu (using Configuration > Custom Fields), selecting your Custom Field and clicking the Edit Custom Field button on the upper toolbar. This allows you to change the name, Import Notes and the Custom Field's availability in the Custom Field selection (green and blue) grids.
Unticking the Available box means that a Custom Field is no longer intended for use. It will still be visible if already selected in a Test but cannot be saved when Editing or Running a Test. If you Edit History then Not Available options can be saved, so long as you don't change them. This allows editing of other properties of a History Item without changing the Custom Field value set at the time of the run. Custom Fields marked Not Available can still be selected in filters and Reports so that you can report on old Tests.
On the right click menu you are also given the option to delete a Custom Field. In general this should only be used for Custom Fields which have never been used, or when a Custom Field has not been used for a very long time and there is no interest in it, or it is cluttering the Custom Field user interfaces.
When you delete a Custom Field, if you have not used it with any Tests (Planning or Run History) then it will be permanently deleted. If you have set this Custom Field in, for example, the Planning tab of a Test then you will only be given the option to mark it as deleted. The Custom Field will be removed from the user's view, but the information regarding it will still be stored by TestCaddy to protect data integrity. This means it will act just like a deleted Test. Custom Fields marked as Deleted are NOT listed in filters and Reports: from the user perspective they are gone and there is no record of them having been recorded.
Custom Fields marked as Deleted can have the delete action reversed by clicking 'Show all Custom Fields' (deleted Custom Fields can only be seen if 'Show all Custom Fields' is ticked), then right clicking and selecting 'Undelete'. This restores it to full functionality.
You can also change the Type of a Custom Field if you have ticked the "Allow Type changes" box. This allows you to change a Folder to a Drop Down etc. This is an advanced feature for use after importing Custom Fields from another product and hence should be used with caution. TestCaddy will notify you by marking a Custom Field as "INVALID TYPE" or "INCONSISTENT TYPE" if your changes result in a Custom Field that is in an illegal position (e.g. a Drop Down Item directly under a Folder).

The Planning tab allows you to set which Environment and Custom Fields you want the Test to be run on, separately from the actual test run. The Planning tab can only be edited when editing a Test, not while running a Test. The fields you set on the Planning tab remain unchanged if you set them again when running the Test.
The Planning tab also allows you to write Planning notes to indicate to other testers specific instructions on the running of the Test.
Click on any Test in a Test Set, and use the right click menu to assign a tester for that Test. (Note: This option is not available in the Master Test Library.) Either individual Tests, or all the Tests in a particular Functional Area Folder may be assigned, and will show the name of the tester to whom they have been assigned (while still remaining available to others). A Test is only locked for the time it is actually in use. The right click menu offers three options:
Filters can be created for the Active Set (lower left treeview) and for Reports and Dashboards.
The Filter is on the Active Set Panel toolbar. The status of the filter stays constant as you move through TestCaddy's modes, and when you exit and return to TestCaddy your last filter is always remembered.
This filter is useful when Selecting Tests. The specified text filter option can be used to limit Tests in the treeview to Tests that mention a certain feature, no matter what Functional Area Folder they are stored in.

Warning: If you can't see what you are expecting in the Active Set Panel treeview then check your current filter and turn it off if necessary.
You can use the filter when in the Reports section to filter the Tests that your graphs display. This lets you customise your graph sets to show only the Project's progress on a specific area. Edit your selected graph and use the filter as you would if in the Active Set and Save. Your graph will now show only Tests that match your filter when viewed in either Reports or in the Dashboard tab.
For more information refer to Advanced Reports (Section 7).
You can view only the Tests of interest by enabling the filter and selecting which Tests will be displayed.
In all, there are four different filters that can be used. They allow you to see only:
Simple checkbox ticks allow any combination of the above filters to be used. For example, you can filter to see only Tests that have a status of Passed AND are Assigned to tester 'Bob'.
The Custom Fields section of the Filter Dialog has an SQL Query text box where you can enter SQL filter strings to refine your Custom Field filter.
The Custom Fields SQL Query box does not accept keystrokes, with the exception of the left and right arrow keys, spacebar and the backspace keys. It treats each Custom Field as a "chunk" and has special text highlighting and navigation buttons for these chunks.
The left and right arrow keys, spacebar and the backspace keys allow you to navigate and delete "chunks" of your search query. Using the arrow keys will highlight the next section of your search and pressing backspace will delete the next section to the left. These keys also have their own buttons at the bottom of the dialog as well as "OR", "AND", "Clear" and "Insert Custom Field item(s)".
Example:
To see Tests that have been run with WinXP32 or Win7, and have a Review Status of "Yes":
This would give you the search query of ("OS--WinXP32" AND "Review Status--Yes") OR ("OS--Win7" AND "Review Status--Yes"). You can then choose to have your query filtered as "Contains" or "Matches Exactly", depending on how you want to define your search. You can also choose whether to filter on Planning, Last Test run, or Test Properties (or any combination of the above) by ticking the checkboxes on the dialog.
Note: When filtering on Custom Fields the filter will sometimes simplify your search query if it can, to optimise your search.
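To illustrate how a chunk-based query such as ("OS--WinXP32" AND "Review Status--Yes") OR ("OS--Win7" AND "Review Status--Yes") selects Tests, here is a hypothetical sketch of the matching logic; it is not TestCaddy's actual implementation:

```python
# Hypothetical evaluation of a chunk-based Custom Field query.
# Each chunk is "Field--Value"; a Test matches a chunk when its
# recorded value for that field equals the chunk's value.

def chunk_matches(test_fields, chunk):
    field, value = chunk.split("--", 1)
    return test_fields.get(field) == value

def matches(test_fields, query):
    # query is a list of AND-groups, OR-ed together, mirroring
    # ("A" AND "B") OR ("C" AND "D").
    return any(all(chunk_matches(test_fields, c) for c in group)
               for group in query)

query = [["OS--WinXP32", "Review Status--Yes"],
         ["OS--Win7", "Review Status--Yes"]]

run1 = {"OS": "Win7", "Review Status": "Yes"}   # matches the second group
run2 = {"OS": "Win7", "Review Status": "No"}    # matches neither group
```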
If you wish to change the current status of a Test, simply select the Test and click on the small arrow to the right of the status (lower right toolbar). A dialog box allows the option of a comment in the Actual Result Field, and a check box turns off the need to record version and environment.
Certain tasks should only need to be carried out once and then details communicated to the rest of the team. A Team Leader should be selected to carry out these tasks to avoid duplication of effort.

The Team Leader should create the product database(s). Although these can be created at any workstation, it is preferable for the Team Leader to take on this task. The Team Leader should consider whether to have a) one product database for each product, with Folders for each version, or b) one product database for each major version.
For example, databases for KoffeeSoftware and SuperEmailer might be set up as:
a) One database per product, with Folders for each version:

   KoffeeSoftware Database
       Version 6.0 Folder
       Version 6.1 Folder
   SuperEmailer Database
       Version 3.0 Folder
       Version 4.0 Folder

b) One database per major version:

   KoffeeSoftware v6.0 Database
   KoffeeSoftware v6.1 Database
   SuperEmailer v3.0 Database
   SuperEmailer v4.0 Database
The Team Leader may wish to set up an appropriate login name for each member of the team. New login names can be added at any time, and a login may be deactivated if necessary. Individuals may add their own login at any time.
Tip: We suggest each tester uses the same login name as used for your bug-tracking tool.
Each team member should install TestCaddy on his or her workstation and all should use the same product database(s). Team members will need the computer name (or IP Address) of the SQL server that TestCaddy is using, and the username and password required to login. (The SQL server can be set up for Windows Authentication, if that method is preferred.)
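The details listed above are the same ones that appear in a standard SQL Server connection string. As an illustration (the server name, database, username and password below are placeholders, not values TestCaddy supplies):

```python
# Illustrative SQL Server connection strings built from the details
# each team member needs; all concrete values here are placeholders.
server = "TESTSERVER01"        # computer name or IP address of the SQL server
database = "KoffeeSoftware"    # the TestCaddy product database

# SQL Server authentication (username and password):
sql_auth = (f"Server={server};Database={database};"
            f"User Id=tester;Password=secret;")

# Windows Authentication, if the server is set up for it:
win_auth = f"Server={server};Database={database};Trusted_Connection=True;"
```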
TestCaddy is designed so that users should be able to administer it themselves, e.g. through adding versions and builds, or adding user logins.
Tip: If working on two products at one time, it is necessary to run two copies of TestCaddy at once, one for each product.
Work on each product will probably be divided into Projects (which may be carried out in several phases), and the Projects may be divided into Folders. Folders for each Project (and sub-folders for each phase) are created in the Projects Panel. Within a Folder you can create a collection of Tests grouped into Test Sets.
The hierarchy looks like this:
Product Database
    Master Test Library
    Folder 1
        Sub-Folder 1
            Test Set 1
            Test Set 2
    Folder 2
        Sub-Folder 1
            Test Set 1
        Sub-Folder 2
            Test Set 1
Choose what the Functional Areas will be, then create them in the Master Test Library. The Test Sets and Tests are conveniently grouped according to Functional Area in the Active Set Panel. These might include, for example, installation, web page, security features, etc.
Splitting your Tests into the Functional Area Folders is recommended but not mandatory. You can group your Tests in whatever way you desire. Some teams group by testing type, such as functional tests, regression tests etc. Others may group by functionality or GUI layout, or by some combination of both groupings. Whatever is chosen, the Functional Areas and groupings can be edited or added to as work proceeds.
EXAMPLE: Functional Areas for KoffeeSoftware Version 6.0.1:
Install/Uninstall/Upgrade
Reporting and Logging
Web Page
Text Entry, Selection and Deletion
Text Formatting
Security Features
Miscellaneous
Tip: Another area such as "Non-functional testing", or a "Miscellaneous" category, may be added for convenience if some Tests do not fit readily into any Functional Area of the product.
If no Functional Areas are set up, all Tests will be listed at the top level in the Master Test Library. All TestCaddy users see the same Functional Areas in the Active Set Panel treeview. They are the same for all Test Sets in the product database into which you have logged. Grouping the Tests in the Master Test Library into Functional Areas is helpful as your quantity of Tests increases, particularly when reviewing which areas of your product you have tested to assess testing coverage. Functional Areas help with breaking up or assigning work, and even help with simply trying to find a particular Test. We find that Functional Areas are a great way to group Tests from a user-centric perspective, helping you to review testing coverage from the customer's point of view.
As a simple example, if you are testing a calculator application you could group Tests EITHER for elements of the screen (e.g. digit button tests, operator buttons like plus and minus tests, etc.), OR by considering the way that a user would think (e.g. by addition, subtraction etc.). Typically a combination is needed.
Functional Areas can also be created for things like exploratory sessions, where you can easily add and edit Tests as you run them to track your exploratory session.
Much of the functionality of TestCaddy is accessed via the right click menu (or alternatively, by the use of short-cut keys). This allows the rapid building of Test Sets as work progresses. Previously written Test Sets can be cloned, renamed and edited, with some Tests omitted and new Tests added.
Tip: Use placeholders to quickly build a new Test Set. The detail can be added later.
If the Master Test Library is selected the right click menu offers the following options:
If Project Test Sets and an item in the Projects Panel are selected then the right click menu offers the following options:
If Project Test Sets and an item in the Active Set Panel are selected then the right click menu offers all the same functionality as in the Master Test Library.
When a Test is selected in either the Master Test Library or the Projects Panel, the right click menu gives the option to create a related test with shared steps. The Create Related Test option from the drop down menu allows three sorts of related tests to be created: Coupled Tests, Test Set Instances or Linked Tests. The three types provide different options for building on the original test, allowing it to be run across multiple environments, or with slightly different combinations of steps.
Coupled Tests share the same Test Steps, but allow Test Properties and Test Planning differences.
Test Set Instances share the same Test Steps and Test Properties but allow only Test Planning differences.
Linked Tests share just a few Test Steps from another Test and don't share any other test information.
Note: Test Set Instances are not real tests because they are not created in the Master Test Library and can only be created in a Test Set. They only share the lifetime of the Test Set and hence are useful because they don't clutter your Master Test Library with test variations that are only relevant for one round of testing.
In versions prior to 1.7.0, Test Set Instances were simply called Instances.
Related tests can use Template Variables to assist in reusing the same test or steps with minor variations based on selected Custom Fields. Template variables allow the Custom Fields set on each test to display in the text fields, e.g. the Description field on a test step might be "Install on {{OS}}" which uses the OS template variable. On each copy of the test the variable will be replaced with the OS custom field value set for that test, e.g. Windows8 on one test and Windows7 on a Related Test.
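The substitution described above can be sketched as follows; the expand function is a hypothetical illustration of the behaviour, not TestCaddy code:

```python
import re

# Hypothetical sketch of Template Variable substitution: a variable
# like {{OS}} in a step's text is replaced by the OS Custom Field
# value set on each Related Test.  Unknown variables are left intact.
def expand(text, custom_fields):
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: custom_fields.get(m.group(1), m.group(0)),
                  text)

step = "Install on {{OS}}"
win8_step = expand(step, {"OS": "Windows8"})   # "Install on Windows8"
win7_step = expand(step, {"OS": "Windows7"})   # "Install on Windows7"
```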
Related Tests may be run in the same way that any other test is run.
Coupled Tests share the same Test Steps as their parent test, but allow Test Properties and Test Planning differences.
Coupled Tests should be created where the Template Test and its Coupled Tests are likely to be used across several different projects and are therefore relevant to several different Test Sets. Coupled Tests are always automatically added to the Master Test Library whether you create them in the Library or in a Test Set.
To Create a Coupled Test
1. Right click a Test in the Active Set Panel to both select the Test and display menu options.
2. Select Create Related Test. A second drop down level will appear.
3. Select New Coupled Test from this second drop down. A new test will appear in the tree-view named as {Coupled: <Test name>}.
The Test Definition of the new Coupled Test will appear in the Work Area. The steps will be identical to the original test, except that they will now be "Calls" rather than "Steps": the only Step in the Coupled Test is a step that calls the Steps from the original, and those called steps are numbered 1-1, 1-2, etc. rather than 1, 2, etc.
Note: The Edit Test button is still active, but only a limited number of functions remain available for editing the new Coupled test. Clicking the Edit Test button allows Call notes to be added to the description of the remaining Step, but the description and expected results of the various 'called' Steps can no longer be changed.
To Edit the Planning of a Coupled Test
1. With the Coupled Test selected in the Active Set panel, click the Planning tab at the top of the Work Area.
2. Click the Edit Test button.
3. Change the Assignee, Environment or General Custom Fields as required.
4. Click the Save button at the bottom right of the Work Area to save your changes and exit the editing mode.
5. Alternatively, click the Cancel button at the bottom right of the Work Area to exit without saving.
Note: While in Editing Mode, it is possible to switch between the Test Definition, Test Properties and Planning tabs, but only some features may be edited: Test Properties, Test Planning, and the Call Notes on the calling Test Step. You must click Save or Cancel to exit Editing Mode.
Test Set Instances can only be accessed in the Test Set in which they were created, and are used when the Related Test is only to be run for one particular project. Use of Instances helps to avoid cluttering the Master Test Library with copies of very similar tests.
Test Set Instances are copies of the original test that exist only in that Test Set and cannot themselves be edited. Their Planning, however, can be edited, allowing different environments to be assigned to different Instances. This allows, for example, identical tests to be run on different operating systems, with the results on each system recorded separately.
To Create a Test Set Instance
1. Right click a Test in the Active Set Panel to both select the Test and display menu options.
2. Select Create Related Test. A second drop down level will appear.
3. Select New Test Set Instance from this second drop down. A new test will appear in the tree-view named as [1] <Test name>.
The Test Definition of the new Instance will appear in the Work Area; in this case the Steps are identical to the original, because a Test Set Instance is an exact copy of the original Test.
Instances are particularly useful for running the same test across multiple operating systems and recording the results separately. It is also possible to assign different testers for different instances, or to note other characteristics such as reviewed status.
To Edit the Planning of a Test Set Instance
1. With the Instance selected in the Active Set panel, click the Planning tab at the top of the Work Area.
2. Click the Edit Test button.
3. Change the Assignee, Environment or General Custom Fields as required.
4. Click the Save button at the bottom right of the Work Area to save your changes and exit the editing mode.
5. Alternatively, click the Cancel button at the bottom right of the Work Area to exit without saving.
Note: While in Editing Mode, it is possible to switch between the Test Definition, Test Properties and Planning tabs, but only Planning tab features may be edited. You must click Save or Cancel to exit Editing Mode.
Linked Tests have one or more steps that "Call" test steps from other tests. They can also have their own plain test steps. The resulting mixture provides a useful test in its own right, which shares a full test or just a group of steps from another test. This is useful where a number of tests share common steps; for example, common pre-test setup steps could be shared. Linked Tests can be edited under the Test Definition, Test Properties and Planning tabs, allowing you to add or remove steps, alter the Test Purpose, add notes, change the Test Property Custom Fields, or change the Test Planning. As with Coupled Tests, this allows different Linked Tests to be set, for example, to use different operating systems via the Test Property Custom Fields. In addition, the ability to edit Test Definition steps allows tests to be extended or shortened as changes are made to the software being tested.
The ability to edit Test Definition steps sets Linked Tests apart from Coupled Tests (where only Test Property Custom Fields can differ from the parent) and Test Set Instances (where only Test Planning can differ from the parent).
To make a Linked Test, you have two options:
A. Start by creating a test, then add a "Calling" test step using the New Linked Step... item on the test steps list's right click menu. TestCaddy will display the Master Test Library for you to select the Test or Test Step(s) you wish to link to. Or,
B. You can right click on a test and select New Linked Test from the Related Tests menu. One Called step will be inserted which calls all the steps in the test you had selected.
Either way, you can then edit the Linked Test to add in your non-shared steps.
Linked Tests can be edited fully (other than the called test steps' fields such as Description and Expected Result) and are completely separate from the Tests they are linked to.
When you select test steps to be called from another test, you can insert one Calling step for each step in the called test or you can have just one Calling step which calls all or a number of the called tests steps. Sometimes it is preferable to have one Linked step for each shared step (one to one relationship) but generally using one Linked step to call all the shared steps makes the test more usable when running the test.
When you run a Linked Test you can set a step result for each of the called steps. You do not set a result on the Calling step; it is always marked with an NA (Not Applicable) status.
To Create a Linked Test
1. Right click a Test in the Active Set Panel to both select the Test and display menu options.
2. Select Create Related Test. A second drop down level will appear.
3. Select New Linked Test from this second drop down. A new test will appear in the tree-view named as Linked: <Test name>.
The Test Definition of the new Linked Test will appear in the Work Area. The steps will be identical to the original test, except that they are now "Calls" rather than "Steps". However, unlike Coupled Tests (above), plain Steps can still be added and removed, and the order of steps can be changed. The steps from the original are numbered 1-1, 1-2, etc. rather than 1, 2, etc.
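The 1-1, 1-2 numbering can be pictured as a flattening of the step list, where a Call expands into the called test's steps under the calling step's number. This is a purely illustrative data-model sketch (representing a Call as a nested list is an assumption, not TestCaddy's internal format):

```python
def numbered_steps(steps):
    """Flatten a Linked Test's step list into (number, description) pairs.
    A plain step keeps its own number n; a Call (modelled here as a nested
    list of the called test's steps) expands to n-1, n-2, ... entries."""
    out = []
    for n, step in enumerate(steps, start=1):
        if isinstance(step, list):  # a Call to another test's steps
            out.extend((f"{n}-{m}", s) for m, s in enumerate(step, start=1))
        else:
            out.append((str(n), step))
    return out

# A Linked Test: one Call to a two-step setup test, plus one plain step
linked = [["Open installer", "Accept licence"], "Verify desktop shortcut"]
for number, description in numbered_steps(linked):
    print(number, description)  # 1-1, 1-2, then 2
```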
To Edit a Linked Test
1. Click the appropriate tab (Test Definition, Test Properties, or Planning) at the top of the Work Area.
2. Click the Edit Test button.
3. As desired, edit the Steps or their order and the Call Notes (Test Definition tab); the Test Purpose, Test Notes, and Environment and General Custom Fields (Test Properties tab); and the Environment and General Custom Fields and Planning notes (Planning tab). Jira tickets may also be added or edited.
4. Click the Save button at the bottom right of the Work Area to save your changes and exit the editing mode.
5. Alternatively click the Cancel button at the bottom right of the Work Area to exit without saving.
Note: The Edit Test button is fully active for Linked Tests, meaning that a Linked Test could ultimately be edited to the extent that it bears no resemblance to the original. However, it is not possible to edit or delete 'Called Steps'. To delete these, the Step that makes the “call” must be deleted.
To Add New Steps to a Linked Test
1. Click on the Linked Test you wish to edit.
2. Click the Test Definition tab at the top of the Work Area.
3. Click the Edit Linked Test button.
4. Right click on any Step or Call in the Step box. A drop down menu will appear.
5. Click New Step. This will create a new step at the end of the list of steps.
6. Click in the Description box and type in a description of the new test step.
7. Click in the Expected box and type in the expected result of the new test step.
8. Click the Save button at the bottom right of the Work Area to save your changes and exit the editing mode.
9. Alternatively, click the Cancel button at the bottom right of the Work Area to exit without saving.
To Add Steps from another Test to a Linked Test
1. Click on the Linked Test you wish to edit.
2. Click the Test Definition tab at the top of the Work Area.
3. Click the Edit Linked Test button.
4. Right click on any Step or Call in the Step box. A drop down menu will appear.
5. Click Linked Step. A second drop down menu will appear.
6. Click New Linked Step on this drop down. Test Caddy Explorer will open in a new window, showing the Master Test Library and the Tests within it grouped by Functional Area.
7. Navigate to the desired Test and then test step by clicking on the entries within each level. (Go back by clicking on the appropriate entry in the bread crumb trail at the top of the window.)
8. Check the checkbox(es) next to the desired Test or Test Steps. (There is no need to check each step if you want all of them.)
9. Click the Select button to add the selected Test or test step to the Linked Test.
10. Click the Save button at the bottom right of the Work Area to save your changes and exit the editing mode.
11. Alternatively, click the Cancel button at the bottom right of the Work Area to exit without saving.
Note: The Step that has been linked cannot be edited directly, as it is now a Call. The new step actually calls the linked step from another Test, and only this new step can be edited, using the right click menu to edit, delete, or move the step up or down.
It is important to note that all the steps from one test can be added to a Linked Test. In the Test Caddy Explorer window, navigate to the level where individual tests are displayed then click on the Check box to select a Test. Then click the Select button. A new step will be added to the Linked Test which calls all the steps from the selected Test. However, these 'Calls' cannot be edited or their order changed. If you wish to change their order then each step must be added separately.
As with Coupled Tests, Linked Tests can also have their Properties and Planning edited, or Jira tickets can be added.
Any Related Test (including the base Template Test) can accept Custom Field Template Variables in any of its text fields.
Example: Related Tests offer code completion menus when you type the opening bracket character { in their text fields. The code completion menu lets you choose from your available Custom Fields and insert one as a Template Variable into a text field for replacement.
Note: For Template Variables to work on the parent test called from any Related Test, it must be marked as a Template Test. This is usually done automatically when you create the Related Test; e.g. any test which is called from a Linked Test, or is the parent of a Coupled Test or Instance, automatically turns into a Template Test. If you create a new parent test and have not yet created a Related Test, you can turn on variables manually by clicking the Template button.
Advanced: You can also select the New Variable option on the code completion menu to create your own Test Specific Custom Fields that are only available on the test you created them on (or any Related Tests). They can be edited and values added in the Custom Fields Manager; they are located under the Generic Custom Fields > Test Specific item in the Custom Fields tree in the manager.
View > Layout allows you to set your Layout preferences. For example you can resize windows and panes (panels) to suit your work style, then at any time:
Configuration > Settings. The Settings dialog box lists all settings for the current user, with options to also show settings that apply to other users or to all (useful for the Team Leader to view). All settings can be edited by the user or returned to their default values.
Note: It is unusual to need to edit settings in this way. Most settings are set by TestCaddy for you as you use the product.
Some unusual settings can be found in the Settings dialog. For example:
Data from TestCaddy can readily be exported in CSV format. Select any Test Set and, using the right click menu, select Export, then confirm where you would like the data to be filed. Alternatively, select any Folder and use the right click menu to export all Test Sets in the Folder.
The data is exported in both CSV (.csv) and Excel (.xls) formats. An MHTML version of the Excel format is also created (.xls.mhtml). The CSV format is optimized for importing into another tool; the Excel format is optimized for readability.
Tip: Data exported to CSV will include testing history. However, should you prefer Tests to be exported without history, select the Master Test Library, right click on the top node in the Active Set Panel and choose the Export All Tests in Set option.
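Because the CSV export is designed for import into other tools, it can be post-processed with standard tooling. A minimal sketch, with illustrative column names only (TestCaddy's actual export columns may differ):

```python
import csv
import io

# Illustrative export content; the real TestCaddy column names may differ.
exported = io.StringIO(
    "Test,Step,Result\n"
    "Login,1,Passed\n"
    "Login,2,Failed\n"
)

# DictReader maps each row to {column name: value}
rows = list(csv.DictReader(exported))
failed = [r for r in rows if r["Result"] == "Failed"]
print(f"{len(failed)} of {len(rows)} steps failed")
```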
TestCaddy is designed so that the screen can be navigated and all functions performed using the keyboard. Testers no longer need to perform thousands of mouse clicks!
Your keyboard can be used to:
Here is a list of the most common TestCaddy shortcuts and the locations in which they can be used.
Ctrl-C Copies your selection to your clipboard. This can be used to duplicate many TestCaddy items.
Ctrl-X Removes your selection and places it on your clipboard. This can be used to move many TestCaddy items.
Ctrl-V Pastes the data from your clipboard to your selected location.
Note: For a full list of TestCaddy shortcuts refer to Appendix 1 (Full List of TestCaddy Shortcuts).
TestCaddy is designed to integrate closely with Jira, a common bug-tracking tool. TestCaddy has been tested on versions 4.1 and above.
The Jira Integration feature is not enabled by default. To enable it, ensure that Jira is set up to allow access to the API (Accept remote API calls). Then in TestCaddy click File > Jira Configuration Options; TestCaddy will ask if you wish to enable Jira Integration for all users of the current product database. Note that each product database has its own Jira Configuration settings, so different products can use different Jira installations.
Note: TestCaddy uses the Internet Explorer component to access Jira, so you will need to update your Internet Explorer to use Jira. Jira 5 requires Internet Explorer 8 or higher.
Users can be required to authenticate with their Jira username and password. Before applying this setting, ensure that your Jira server information and login credentials are correct; you can be locked out of your product database if they are incorrect. To enable this setting, click Configuration > Settings. A list of settings will be displayed. Scroll down until you find "Authentication, JIRA". Select this line and click the Edit button. Confirm that you wish to change the value; after restarting TestCaddy, Jira Authentication will be enabled on login.
Note: Once Jira Authentication is enabled the Jira URL cannot be edited under the Jira Configuration options.
Note: The filter value is set for each user under Configuration > Settings > Jira, Filter Regexes.
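Standard Jira issue keys have the form PROJECT-123 (an uppercase project key, a hyphen, then a numeric issue id), so filter regexes typically match that shape. A hedged sketch of how such a regex behaves; the exact pattern TestCaddy expects is not documented here:

```python
import re

# Matches standard Jira issue keys such as "TC-42" or "PROJ-1234".
# This pattern is illustrative, not TestCaddy's built-in filter value.
ISSUE_KEY = re.compile(r"^[A-Z][A-Z0-9]+-\d+$")

def matches_filter(ticket):
    """Return True if the ticket string looks like a Jira issue key."""
    return bool(ISSUE_KEY.match(ticket))

print(matches_filter("TC-42"))      # True
print(matches_filter("not a key"))  # False
```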
In addition, the Dashboard can be used to produce new, customized reports or graphs and export these to a file, email or website. This allows you to display the status of a Project to colleagues, clients or customers.
This TestCaddy User Manual is available on a Menu tab in the Work Area (Help > User Manual). There are two options, a local version of the manual for quick access, or a link to the online version for latest updates. As with other TestCaddy tabs you can drag it so that the window floats and is always on top.
The TestCaddy Release notes for your latest build can be found here, or in <InstallationFolder>\TestCaddyManual\ReleaseNotes.html.
The TestCaddy FAQ (Frequently Asked Questions) section of our website contains useful information for every step of TestCaddy usage. If you have any issues with TestCaddy please check these FAQs to see if your problem can be easily resolved before contacting us. The FAQ can be accessed here.
Shortcut | Edit/Run Test mode availability | Project Test Sets availability | Active Set availability | Master Test Library availability |
Ctrl-C | Copy selected text to Clipboard | Copy selected Folder or Test Set | N/A | Copy selected FA or Test |
Ctrl-X | Cut selected text to Clipboard | Cut selected Folder or Test Set | N/A | Cut selected FA or Test |
Ctrl-V | Paste text from Clipboard | Paste a copied/cut Folder or Test Set | N/A | Paste a copied/cut FA or Test |
Ctrl-N | Creates a new Test Step | Creates a new Test Set | Creates a new Test | Creates a new Test |
Ctrl-Del | Deletes selected Test Step | Deletes selected Folder or Test Set | N/A | Deletes Selected FA or Test |
Ctrl-D | Moves selected Test Step down | Moves selected Folder or Test Set down | N/A | Moves selected FA or Test down |
Ctrl-U | Moves selected Test Step up | Moves selected Folder or Test Set up | N/A | Moves selected FA or Test up |
F2 | N/A | Renames selected Folder or Test Set | Renames selected FA or Test | Renames selected FA or Test |
Esc | Cancels changes | Cancels rename | Cancels rename | Cancels rename |
Ctrl-S | Saves changes | N/A | N/A | N/A |
Ctrl-M | N/A | Assign Test to current user | N/A | N/A |
Ctrl-End | N/A | N/A | Removes selected Test(s) from Test Set | N/A |
Ctrl-P | Sets selected Test Steps as Passed | N/A | N/A | N/A |
Ctrl-F | Sets selected Test Steps as Failed | N/A | N/A | N/A |
Ctrl-E | Sets selected Test Steps as Deferred | N/A | N/A | N/A |
Ctrl-L | Sets selected Test Steps as N/A | N/A | N/A | N/A |
Ctrl-G | Sets selected Test Steps as Not Run | N/A | N/A | N/A |
Ctrl-Alt-P | Sets all Test Steps as Passed | N/A | N/A | N/A |
Ctrl-Alt-F | Sets all Test Steps as Failed | N/A | N/A | N/A |
Ctrl-Alt-E | Sets all Test Steps as Deferred | N/A | N/A | N/A |
Ctrl-Alt-L | Sets all Test Steps as N/A | N/A | N/A | N/A |
Ctrl-Alt-G | Sets all Test Steps as Not Run | N/A | N/A | N/A |
Ctrl-Z | Undoes latest change (in Editor Field) | N/A | N/A | N/A |
Ctrl-Y | Redoes latest undo (in Editor Field) | N/A | N/A | N/A |
Ctrl-B | Sets text to Bold (in Editor Field) | N/A | N/A | N/A |
Ctrl-I | Sets text to Italic (in Editor Field) | N/A | Creates an instance of selected Test | N/A |
Ctrl-U | Sets text to Underline (in Editor Field) | N/A | N/A | N/A |
Ctrl-A | Select all text | Select all Folders and Test Sets | Select all Functional Areas and Tests | Select all Functional Areas and Tests |
Ctrl-R | N/A | Refresh | Refresh | Refresh |
Tab | Move forward to next field | N/A | N/A | N/A |
Shift-Tab | Move backwards to previous field | N/A | N/A | N/A |
Ctrl-1 | Focus on Projects | Focus on Projects | Focus on Projects | Focus on Projects |
Ctrl-2 | Focus on Active Set | Focus on Active Set | Focus on Active Set | Focus on Active Set |
Ctrl-3 | | | | |
Ctrl-4 | Change to layout 1 | Change to layout 1 | Change to layout 1 | Change to layout 1 |
Ctrl-5 | Change to layout 2 | Change to layout 2 | Change to layout 2 | Change to layout 2 |
Ctrl-8 | Enable transparent windows | Enable transparent windows | Enable transparent windows | Enable transparent windows |
Ctrl-9 | More Transparent | More Transparent | More Transparent | More Transparent |
Ctrl-0 | Less Transparent | Less Transparent | Less Transparent | Less Transparent |
↑ | Navigate up | Navigate up | Navigate up | Navigate up |
↓ | Navigate down | Navigate down | Navigate down | Navigate down |
→ | Navigate right | Expand node | Expand node | Expand node |
← | Navigate left | Contract node | Contract node | Contract node |