The following is a listing of the events for the 2004/2005 season, including presentation abstracts and speaker biographies. Please see the 2004/2005 Season page for details on the schedule, location, and sponsor(s).
Much of the buzz this year at the XP Agile Conference was "How do we get agile accepted into the mainstream?" Some ideas presented by keynote speakers were carried throughout the conference. This was not a planned theme, but it shows where agile methods are headed. I will summarize some of the ideas presented on this topic and highlight some of the other sessions.
Janet Gregory is the Quality Manager at Wireless-Matrix. Her passion is promoting agile quality assurance processes, and she has helped to introduce Agile Development Methodologies into several companies, playing the roles of Coach, Iteration Manager, and tester. She is a member of the organizing committee of CAMUG (The Calgary Agile Methods User Group). Janet has a degree in Computing Science from the University of Alberta, a Quality Management Certification from ASQ, and an Information Management Certificate from the University of Calgary.
This will be our first open discussion of the new season. The theme is what you consider to be your "essential toolkit" that supports the work that you do. The key to a successful, open discussion is participation, so come prepared to (at least) talk about, and (possibly) demonstrate the tools that you use and consider mandatory to be productive at your job.
This is a good opportunity to share your successes, and failures (yes, we've all made some bad choices) with the group, and to learn about new and different ways of approaching similar problems. Given the diverse nature of the group, expect insights into all facets of software development, quality and management. Above all, come out and participate.
This session will explain how the Postmortem method can be used to give feedback on a project and thereby help an organization learn from what it does well and what it needs to improve. Postmortems (also known as Lessons Learned and Post-Phase Analysis) can be conducted at the end of a project, at the end of a phase, or outside the work environment, such as in volunteer groups. The session will include the theory and an example.
Fiona Koether has been a participant and facilitator for the Software Quality discussion group since its inception. She works for Nortel in Calgary as a software process specialist. She has 20 years of experience in software development in roles ranging from Programming and Systems Analysis to Consultancy and Management.
For our second open discussion session of the year we want to hear your "scariest" or "weirdest" stories about life in the trenches. Was it the pre-release build that crashed everything? What about the user from "hell" that just couldn't get it? What about those projects that we've all been on that went into the "death march" phase, and nobody could figure out why? We want to hear from all of you. Hopefully, we'll all come away with some insight into how we can avoid similar situations, or perhaps we'll just breathe a big sigh of relief and mumble, "there, but for the grace of God", etc. At the very least, we're looking to have some fun here. Costumes are optional, but you get extra brownie points if you come dressed appropriately.
This was a last-minute change. The previously scheduled topic, "The Role of the Q/A Manager", has been re-scheduled to January 12, 2005. Tonight's meeting focused on the content and approaches for ensuring that IT's message is heard, both in terms of making the correct technical decisions and in terms of process and improvement.
The Production Accounting System (PAS) Project is a co-venture with Devon Canada, EnCana, Husky Energy, Talisman Energy, and CGI to build, maintain and enhance a single solution to be used by these companies, over the next 10 to 20 years. This solution is intended to increase efficiency, improve the quality of information by providing end users more time for data analysis and ultimately result in the ability to make better business decisions. The new system is being developed as an industry solution that will be available to other oil and gas companies that operate in Canada.
The development work is being carried out by a dedicated on-site team of 30+ professionals, including representatives of the four oil & gas companies, who provide the business expertise. The project began in October 2003 with a definition phase, and the two-year development phase began in April 2004. The application is being developed in a 3+ tier architecture based on J2EE standards, on an Oracle database, utilizing an Agile development methodology. The governance of the project includes an Advisory Board, which meets twice monthly to review progress and give strategic direction, and a PMI-standards-based approach to reporting progress.
This presentation will focus on the challenges faced in applying PMI PMBOK principles to a project using Agile practices such as SCRUM and XP to develop the application.
Colin Cassie is a Senior Project Manager with CGI in Calgary, with 15 years of experience as a senior consultant, project manager and executive delivering IT professional services to government. Previously Colin enjoyed a career in construction management, working across Canada and in Africa, Asia, Europe and the Caribbean. Colin came to Calgary in October 2003 to be the Project Manager of a co-venture with CGI and a consortium of major oil & gas companies, to develop a sustainable production accounting solution.
What a great opportunity to meet with fellow quality practitioners to partake of some fine victuals, hoist a few and talk about software quality, amongst other things, I'm certain.
We're meeting at the Auburn Saloon again this year, since our previous outing there was a huge success. The Auburn is located at 119 9th Avenue SW. It's on the south side of 9th Avenue, on the west side of the Palliser Square office tower, directly across from the Glenbow Museum.
As always, this is an informal affair. We can start congregating around 5:30 pm and wrap up any time we want. Food will be served starting shortly after 6:00.
Over the past 30 years hardware reliability has improved but software quality has generally remained unchanged with most customers being dissatisfied with the end result of their software project. How are quality attributes, such as performance, reliability, correctness, security and maintainability (to name a few) prioritized and given the attention needed for a successful software project outcome? What quality attributes are important in your projects? How are your processes supporting your quality goals? How can the QA Manager play a role in making sure that the project is delivered, received, tested and validated according to the needs of the client?
Nina Sharpe is the Quality Manager for the PAS project at CGI. She comes from a software development background and is experiencing an Agile project for the first time. On her current project she works closely with both Development and Business to ensure that the customer gets a quality project. Nina has a Political Science and Economics Degree from Western, and an Information Technology certificate from SAIT.
This session will explain how to evaluate a UML object-oriented application design using the COODEM model. The model allows the designer to verify how close a design comes to a good one, according to criteria based on general design principles and those of the object-oriented paradigm.
The UML graphical notations represent several structures of the system to be developed; the final design should be of excellent quality and contribute significantly to the construction of the software. The session will include the theory and some examples.
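The COODEM criteria themselves are not reproduced here, but the general idea of scoring a design against object-oriented principles can be sketched in a few lines of Ruby. The snippet below is a minimal, hypothetical illustration (the class names and dependency map are invented, and the metric shown is the classic Coupling Between Objects count, not COODEM itself):

```ruby
# Hypothetical class-dependency map, as might be extracted from a UML
# class diagram: each class maps to the classes it references.
DEPENDENCIES = {
  "Order"    => ["Customer", "LineItem", "Invoice"],
  "LineItem" => ["Product"],
  "Invoice"  => ["Order", "Customer"],
  "Customer" => [],
  "Product"  => []
}

# Coupling Between Objects (CBO): the number of other classes a class is
# coupled to, counting both outgoing and incoming references. High values
# suggest a design that will be hard to change in isolation.
def cbo(deps)
  deps.keys.map do |klass|
    outgoing = deps[klass]
    incoming = deps.select { |_, refs| refs.include?(klass) }.keys
    [klass, (outgoing + incoming).uniq.size]
  end.to_h
end

scores = cbo(DEPENDENCIES)
scores.sort_by { |_, v| -v }.each { |k, v| puts "#{k}: CBO = #{v}" }
```

A fuller evaluation model would weigh several such metrics (cohesion, depth of inheritance, and so on) against the designer's criteria rather than a single coupling count.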
Jenny Antoneti has been a participant in the Software Quality Discussion Group since April 2004. She is a Software Engineering graduate student at the Central University of Venezuela. She has more than 20 years of experience in software development for several companies in Venezuela and the U.S.A., and in software engineering teaching and research at several universities and colleges in Venezuela.
Jenny Antoneti holds a Bachelor's degree in Computer Science from the Central University of Venezuela. She received certification as an Information Systems Specialist in 1986 from The George Washington University, and her Master's degree in Computer Science in 1998 from the Central University of Venezuela. Currently, she is working on her PhD in Software Engineering. Her research topics are software architecture, software quality assurance, object-oriented methodologies, and software metrics.
SimQuali (TM) will be the world's first true software quality simulator.
Quality is the discriminating factor in the competitive world of software based products and services. Decisions based on sound methodology integrating human and computational intelligence are the best decisions one can make. This is especially true under increasingly hard constraints for competitive market edge and delivery of a quality product. General-purpose certification or assessment approaches fail to provide necessary operational guidance for how to achieve appropriate levels of quality in the presence of budget, resource, and time constraints.
SimQuali (TM) is the product that will provide intelligent decision support for software quality assurance. SimQuali (TM) will allow you to identify various scenarios, evaluate resulting quality levels, and help you to determine the appropriate amount of quality effort to ensure quality expectations are met and budget/schedule constraints are achieved.
The purpose of this presentation and discussion is to explore SimQuali (TM) and its collaborative development approach, i.e. what works and what doesn't in terms of moving the "project" forward and promoting software quality.
It will also explain what the product brings to software and business processes, and the benefits: essentially, how SimQuali (TM) promotes the larger goal of enhancing software quality.
Anthony Ebsworth is a Senior Practice Manager with Securac in Calgary, Alberta. Mr. Ebsworth has been involved in Software Quality for the last six years of his 19 years of high-tech experience. He is practiced in the development and implementation of QA (and other functions) strategic and operational plans, budgets, teams, processes and tools. He gained this experience within large multinationals and mid-size firms through to a Web-based start-up. He is Securac's Project Manager for the development of SimQuali (TM).
Jim McElroy has extensive expertise in software development processes, having been a primary author of two proprietary processes. He is also proficient in requirements engineering, including advanced use case analysis, and has significant experience in RUP, UML, and transitioning requirements into analysis and design. He holds a B.S. in Computer Engineering from San Jose State University, an M.S. in Computer Science from California State University, Chico, and is working on his Ph.D. in Computer Science at the University of Calgary. Jim also has 20 years of industrial and teaching experience in the computer field. He is the University of Calgary's Systems Analyst for the development of SimQuali (TM).
EnCana uses a Project Support model instead of a more "traditional" PMO. This presentation will compare the efficacy of the two approaches using the experiences at EnCana as a guide.
Cara Fitsgerald, Group Lead, and Roxanne Becker, Consultant, are with the I/S Project Office at EnCana.
Software quality is the degree to which software possesses a desired combination of attributes (modifiability, security, performance, availability, etc.). In this tutorial we describe a few principles for analyzing a software architecture to determine if it exhibits certain quality attributes. We show how analysis techniques indigenous to various quality attribute communities can provide a foundation for performing software architecture evaluation.
Since attributes can interact or conflict (improving one attribute often comes at the price of worsening one or more of the others), it is necessary to trade off among multiple software quality attributes at the time the software architecture of a system is specified, before the system is developed.
It is important to point out that we do not aim at an absolute measure of "architecture quality"; rather our purpose is to identify scenarios from the point of view of a diverse group of stakeholders (e.g., the architect, developers, users, sponsors) and to identify risks (e.g., inadequate performance, successful denial-of-service attacks) and possible mitigation strategies (e.g., prototyping, modeling, simulation).
In the tutorial I will describe processes to conduct architecture trade-off analyses developed by the Software Engineering Institute (SEI). The objective of the evaluations is to understand a software architecture's fitness with respect to multiple software quality attributes and to identify sensitivity points, trade-offs, and risks. Sensitivity points are architectural decisions that have significant impact on a quality attribute; trade-offs are sensitivity points that affect more than one attribute; risks are potential problems in achieving the desired attributes.
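The definitions above lend themselves to a small illustration. The sketch below is hypothetical (the decisions and attribute mappings are invented, not from SEI materials): it records which quality attributes each architectural decision significantly affects, and then picks out the trade-off points, i.e. the decisions that pull on more than one attribute at once:

```ruby
# Hypothetical map: each architectural decision lists the quality
# attributes it significantly affects.
IMPACTS = {
  "encrypt all traffic"    => ["security", "performance"],
  "cache query results"    => ["performance"],
  "single master database" => ["availability", "performance"]
}

# A sensitivity point affects at least one attribute; a trade-off point
# affects more than one, so improving one attribute may worsen another.
sensitivity_points = IMPACTS.select { |_, attrs| attrs.size >= 1 }.keys
tradeoff_points    = IMPACTS.select { |_, attrs| attrs.size > 1 }.keys

puts "Sensitivity points: #{sensitivity_points.join(', ')}"
puts "Trade-off points:   #{tradeoff_points.join(', ')}"
```

In a real evaluation the mapping is elicited from stakeholder scenarios rather than declared up front, and each trade-off point becomes a candidate risk to analyze further.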
Mario Barbacci recently retired from the Software Engineering Institute (SEI) at Carnegie Mellon University. He was one of the founders of the SEI where he has served in several technical and managerial positions, including Project Leader (Distributed Systems), Program Director (Real-time Distributed Systems, Product Attribute Engineering), and Associate Director (Technology Exploration Department). Prior to joining the SEI he was a member of the faculty in the School of Computer Science at Carnegie Mellon University.
His current research interests are in the areas of software architecture and distributed systems. He has written numerous books, articles, and technical reports and has contributed to books and encyclopedias on subjects of technical interest.
Barbacci is a member of the Institute of Electrical and Electronic Engineers (IEEE) and the IEEE Computer Society, a member of the Association for Computing Machinery (ACM), and a member of Sigma Xi. He was the founding chairman of the International Federation for Information Processing (IFIP) Working Group 10.2 (Computer Descriptions and Tools) and has served as chair of the Joint IEEE Computer Society/ACM Steering Committee for the Establishment of Software Engineering as a Profession (1993-1995), President-Elect, President, and Past-President of the IEEE Computer Society (1995-1997), IEEE Division V Director (1998-1999), IEEE TAB Strategic Planning and Research Committee (2000-2002).
Barbacci is a Fellow of the Institute of Electrical and Electronic Engineers (IEEE) and the recipient of several IEEE Computer Society Outstanding Contribution Certificates, the ACM Recognition of Service Award, and the IFIP Silver Core Award. Barbacci received bachelor's and engineer's degrees in electrical engineering from the Universidad Nacional de Ingeniería, Lima, Peru, and a doctorate in computer science from Carnegie Mellon.
When asked about project "success", I believe our stock response is: "Why yes, every project I've worked on has come in on time and on budget!" At face value, that is a truthful, albeit potentially misleading, statement, since estimates should be continuously re-evaluated as the project proceeds. However, it's also (mostly) true that the measure by which all projects are gauged is how close you were to your initial, "pulled from the aether" guess! On the other hand, how many times have you been derisively dismissed when you've tried to be truthful about what it's really going to take to get the job done? Several clichés immediately come to mind: pay me now, or pay me later; and rework is apparently cheaper than good work. Soooo.... what are we going to do about it? Here's your chance to compare notes and suggestions about what works and what doesn't, and to share how you achieve Nirvana: the accurate and timely estimate.
As web applications become more complex, testing time increases, yet frequent builds require more frequent testing. To overcome this, automation must be used. While commercial GUI automation tools for web browsers do exist, they are expensive. Non-browser protocol testers (such as HttpUnit) rarely test client-side JavaScript.
To overcome this, the Watir (Web Application Testing in Ruby) toolkit has been developed.
This toolkit, which is open source, allows almost all normal interactions with a web browser to be entered into a script and then executed against the application under test. This tutorial will offer a brief introduction to the Ruby scripting language, followed by a demonstration of the Watir toolkit.
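As a taste of the Ruby side of the tutorial, the sketch below shows the blocks, iterators, and string interpolation that make Ruby test scripts concise. It is standalone so it runs without a browser; the host, port, and paths are invented for illustration. A real Watir script would additionally drive a browser object (for example, ie = Watir::IE.start(url), then interact via methods such as ie.text_field and ie.button):

```ruby
# Standalone Ruby sketch: no browser required. The URLs below are
# hypothetical; a real Watir test would navigate to each page and
# check its contents instead of inspecting the URL string.
pages = ["/login", "/search", "/logout"]

# Build the URLs to exercise, using a block to transform each element.
urls = pages.map { |path| "http://localhost:3000#{path}" }

results = urls.map do |url|
  # Simulate a per-page pass/fail check; in a real test this would be
  # an assertion against the fetched page.
  { :url => url, :ok => url.include?("localhost") }
end

failures = results.reject { |r| r[:ok] }
puts failures.empty? ? "all #{results.size} checks passed" : "#{failures.size} failed"
```

The same map/reject style carries over directly once the block bodies call into Watir, which is much of what makes Ruby attractive for test automation.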
Paul Rogers is a professional software tester specialising in test automation in a variety of languages. He uses Ruby extensively as a testing tool in various testing projects, from embedded firmware applications to web applications. He has written a controller in Ruby for testing applications with Internet Explorer, and contributes to the open source WATIR project for testing web applications using Ruby. He is a software tester at Wireless Matrix in Calgary, Alberta.
This presentation will provide an introduction to the use of flowcharts to model process workflows. Visually capture the responsibilities, activities, and artifacts for any business or development process, without putting your audience to sleep.
Lisa Molesky is a member of the process improvement group at Telvent. She has 15+ years of experience in quality assurance and process improvement in military, aerospace, and commercial environments. Her current work involves close interaction with both development and business groups to optimize processes within Telvent. Lisa is a professional engineer with a degree in Electrical Engineering from the University of Alberta, and is an ASQ Certified S/W Quality Engineer.
Requirements take many forms and have many names, but, at the end of the day, they are just that -- "requirements" -- things that the system has to do to satisfy users' needs. But since we don't know what we don't know, and quite often the users are in the same position, how good are requirements, really, and how much effort should we be putting into getting them right -- whatever that means? Bring your opinions and ideas out into this "open space" and let's see what stands the test of time.
Ensuring long useful lives for hardware and software systems with the inevitable expansions, upgrades, and previously unconsidered interconnections to other systems is an architectural function. The results can be positive, resulting in long, low-cost system life, or negative, leading to a system with significant limitations.
Often neglected are the architectural techniques and concepts, both in terms of what behaviors are specified, and in terms of what areas are left open. The impact of these areas on the longevity of the system life cycle is often not well appreciated.
We will examine how successful architectures have achieved longevity without major incompatible changes. In the final analysis, success for an architecture is measured by its ability to assimilate changes in mission, implementation, interconnection, and scope without the need for incompatible changes. Put succinctly, 20 years into an architecture's life, success is measured by the ability of systems implemented on Day One to interoperate unchanged with systems implemented on Day 20369.
Mr. Gezelter is a Contributing Editor for the Computer Security Handbook. He has worked with the Internet and its predecessor, the Arpanet, for much of his career. His experience with the Internet, combined with his extensive experience on security related issues in financial and other areas, resulted in his being invited to author the Internet Security chapter of the Third Edition of The Computer Security Handbook (John Wiley and Sons, Fall 1995) and three Internet-related chapters in the Fourth Edition (John Wiley and Sons, Spring 2002).
He has an extensive background in the design, implementation, and utilization of computer systems. His clientele has spanned the full range of computing activities, from governmental administrative systems to real-time defense and process-control environments. His work has spanned the industry, from mainframes to embedded micro-controllers.
Mr. Gezelter's work has included the internals and utilization of a wide range of architectures and platforms, including the IBM 1620, IBM 1130, IBM System/360/370 (and successors), Digital's PDP-11, VAX, and Digital's (now Hewlett Packard's) ALPHA products, systems based upon Intel 80x86 processors, and others. He has worked with a variety of operating systems including UNIX, OpenVMS, MS-DOS, Windows, and the RSX-11 family. In the networking arena, Mr. Gezelter has over 25 years of experience, including BiSynch, RJE, DECnet and TCP/IP.
Since 1985, he has presented over 125 public sessions and seminars at conferences and symposia spanning the range from one-hour conference presentations to full day seminars. He has been an invited speaker at symposia sponsored by DECUS Canada (Vancouver, Montreal, Calgary, and Toronto), DECUS Europe (Cannes), DECUS-General Area (SiDE, Turkey), and DECUS Switzerland (Zurich, Lausanne, Lugano); in addition to being a regular speaker at the Spring and Fall US DECUS Symposia since 1985. He has also been a featured speaker at meetings sponsored by the Association for Computing Machinery, Institute of Electrical and Electronics Engineers, Information Systems Security Association, EDP Auditors Association, and an invited speaker at NASA's Marshall Space Flight Center. A sampling of the presentations can be found at www.rlgsc.com/presentations.html
He has published over 25 articles in a variety of publications including Hardcopy, Digital News, Computer Purchasing Update, Network Computing, Open Systems Today, Digital Systems Journal, Network World and others. He has served as a Contributing Editor for Hardcopy, Digital News, Computer Purchasing Update, and OpenVMS.org. A selection of his recent articles and columns can be reached at www.rlgsc.com/publications.html.
Since 1978, Mr. Gezelter has been in private practice emphasizing operating systems, networks, and security. His particular focus has been the use of architectures to improve leverage and efficiency while reducing complexity and its attendant hazards.
Mr. Gezelter received his BA and MS degrees in Computer Science from New York University in 1981 and 1983, respectively.
This is our annual planning session, where organizers, volunteers and participants, active or otherwise, get a chance to make their mark on next year's sessions. Everything is up for discussion. Bring your ideas and suggestions to make this an even better discussion group next year.