Oslo QA Hackathon 2008 : Topics

Overview

There is no set agenda for the hackathon. Rather, we'll try to make it an Unconference or Open Spaces event.

Tell us about what you'd like to discuss and work on at the hackathon! We'll use this page as a basis for deciding the schedule on the first day of the event. See Oslo QA Hackathon 2008 : Schedule for details as they emerge.

If you haven't yet registered, be sure to add yourself to the Oslo QA Hackathon 2008 : Attendees page.

For other info about the event, see the Oslo QA Hackathon 2008 page.

Topics

Huge interest (@n >= 5)

Module::Build

Abstract
  • Hammer out lingering issues with Module::Build
  • Add any important features that MakeMaker has but Module::Build lacks
  • A Module::Install-like API for writing Build.PL?
  • Find ways to speed up adoption of Module::Build in the Perl community
  • Make it easier to create package metadata that is usable by other packaging systems
  • Put configure_requires into META.yml (if the SVN trunk doesn't do that already; see the sketch below)
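A minimal sketch of what the Build.PL side of this could look like, assuming a Module::Build new enough to accept a configure_requires argument (the module name and version numbers below are placeholders):

  use strict;
  use warnings;
  use Module::Build;

  my $build = Module::Build->new(
      module_name        => 'Foo::Bar',                   # hypothetical module
      license            => 'perl',
      requires           => { 'perl'          => '5.006' },
      build_requires     => { 'Test::More'    => 0 },
      configure_requires => { 'Module::Build' => '0.30' }, # should end up in META.yml
  );
  $build->create_build_script;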
repository
Interested
  • Schwern, rjbs, sjn, David Golden, Thomas Klausner, AdamKennedy

Testing Best Practices

  • Include author tests in the dist or not?
  • Decide on environment variables to trigger extensive / author testing
  • What to put into xt/
  • How to skip tests (Pod::Coverage etc.) if the required module is not present (see the sketch after this list)
  • How to show what the "Best Practices" are, at any time, to anyone
  • How to install files not typically found in CPAN distributions (e.g. config files in /etc, Sys-V init scripts, application-specific templates, configuration examples or third-party config files, Windows Start Menu entries and icons)
  • Portability: avoiding signals and other IPC, problematic filenames, assumptions about Unix command-line tools, and manual prompting
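A sketch of the "skip if the module is not present" and author-test environment variable patterns mentioned above; RELEASE_TESTING is one commonly suggested variable name, not a settled standard:

  use strict;
  use warnings;
  use Test::More;

  # Only run this test for the author / release manager, not during install.
  plan skip_all => 'Author test; set RELEASE_TESTING to run'
      unless $ENV{RELEASE_TESTING};

  # Skip gracefully if the optional test dependency is missing.
  eval 'use Test::Pod::Coverage 1.04';
  plan skip_all => 'Test::Pod::Coverage 1.04 required for testing POD coverage'
      if $@;

  all_pod_coverage_ok();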
Interested
  • Thomas Klausner, sjn, AdamKennedy, David Golden, rjbs, Ovid, andremar

Strong interest (@n == 3 || @n == 4)

CPAN6

Abstract
Rethinking CPAN. Adding modern concepts and Perl6 needs to our beloved software infrastructure has resulted in unexpected new concepts in data sharing.
Description
For the last two years, a new software distribution system has been under development. It may become a follow-up to CPAN, or it may not: it first has to prove itself. By abstracting the distribution process, many existing data-sharing schemes came together: sharing Perl modules, sharing photos within a family, maintaining FTP servers, document flows in a company.
CPAN is beautiful in its freedom of sharing, but it should adapt better to a changing world. Where will we store our Perl6, Parrot, and Parrot-related code? How do we publish the results of additional services, like CPAN Testers? How do we guarantee that the author of the new version of a module is the same as the author of the old version? Perl6 has a different versioning structure, which CPAN cannot handle. Quite a few special-case situations currently require manual intervention from Andreas.
CPAN6 has a few very new sharing concepts, AFAIK not available in any existing application. For the Perl community, it is important to understand that it is only about the distribution process of packages: from upload by the author (PAUSE) to download by the user. It does not reinvent packaging or installation tools; it works at the level of the transport protocol.
When developing a large new system that should accept pluggable modules contributed by various people, you discover that Perl5 is actually quite poor at this. CPAN6 development therefore first focused on general-purpose building blocks:
  • XML::Compile and XML::Compile::SOAP, because the meta-data will be XML based
  • Log::Report, because translation must be simple
  • OODoc, to nicely integrate (especially complex object oriented) manual pages
A few hundred pages of detailed design documents are available from http://cpan6.org, and there are also a few presentations on the subject, 30, 45 and 90 minutes long.
Interested
  • MarkOv, Jonathan Worthington, AdamKennedy

CPANTS

  • Enhance current metrics (test_pod, prereq_matches_use, etc.)
  • Add new metrics
  • Review existing metrics (POD coverage in particular)
  • Add an API-thingy to cpants.perl.org
  • Define and add Kwalitee metrics to CPANTS:
    • easily_repackagable_for_debian
    • easily_repackagable_for_fedora
    • easily_repackagable_for_freebsd
    • easily_repackagable_for_*
    • easily_repackagable
  • See CPAN::Porters for an initial description of what they should cover
  • Improve metrics:
    • has_humanreadable_license should check the texts of licenses (might it also accept a Copyright section, and not only the LICENSE section?)
    • has_bugtracker_url
    • has_mailinglist (Do we really need 11,000 mailing lists?)
    • has_searchable_mailinglist_archive
    • has_sourcecode_repos
    • has_keywords
  • See further issues in http://cpants.googlecode.com/svn/trunk/Module-CPANTS-Analyse/TODO
Interested
  • Thomas Klausner, Gabor Szabo, sjn

FreeBSD ports, Perl module dependencies

Abstract
Specification of Perl module dependencies in the FreeBSD ports infrastructure needs improvement.
  • PERL_DEPENDS
  • PERL_TEST_DEPENDS
  • Dual-life modules
Interested
  • Lars Thegler, tobez, des

Nested TAP

Interested
  • Schwern, rjbs, Ovid

Pair interest (@n == 2)

CPAN.pm hacking

Abstract
Bug fixing, improving (or just documenting) the API, coverage (?!), or whatever else comes to mind.
Interested
  • David Golden, brian d foy

CPAN Testers 2.0

Abstract
CPAN Testers has many known flaws (e.g. email as transport, inconsistent client tools, no author contact preferences). I propose that the workshop assemble a team of developers for a focused sprint to release a "2.0" architecture for CPAN Testers.
Goals
Ideally, by the end of the workshop and conference, the following should be released and in use by at least one high-volume smoke tester for end-to-end CPAN Testers reporting:
  • Web API to receive CPAN Testers reports via HTTP (Ask at Perl NOC already has an alpha version in place)
  • Database backend for central storage of reports (ditto Ask)
  • Web API to query central database of reports (for analysis or reporting)
  • Centralized notifications service to email authors about new reports
  • Website to display report summaries (and drill down to individual reports) by author or distribution
  • CPAN libraries for clients to generate, serialize and transport reports
  • Developer release of at least one end-user tool for submitting test reports using the new transport (see the hypothetical client sketch below)
  • Legacy gateway to capture reports submitted to perl.cpan.testers into new database
Bonus goals, time permitting:
Any subset of the above would be a wonderful step in the right direction.
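A purely hypothetical sketch of what a client submitting a report over the new HTTP transport could look like; the endpoint URL, payload fields and serialization format are all invented for illustration and remain to be decided:

  use strict;
  use warnings;
  use LWP::UserAgent;
  use JSON;                                   # serialization format still undecided

  my $report = {
      distribution => 'Foo-Bar-0.01',          # hypothetical dist
      grade        => 'pass',
      perl_version => sprintf('%vd', $^V),
      osname       => $^O,
      textreport   => '... full test output ...',
  };

  my $ua  = LWP::UserAgent->new;
  my $res = $ua->post(
      'http://testers.example.org/api/report',  # placeholder endpoint
      Content_Type => 'application/json',
      Content      => encode_json($report),
  );
  die 'Report submission failed: ' . $res->status_line
      unless $res->is_success;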
Workstreams
Achieving the goals will likely mean breaking up into teams to pursue components of the overall solution. Exact sequencing to be determined. Developers will likely rotate from team to team as early or easy workstreams are completed and we see where the real challenges are.
  • Clients and transport team
  • Database and web API team
  • Notifications and reporting team
  • Legacy migration team
Participants
Open to suggestions on the number of developers needed. We would need at least several people familiar with the existing CPAN Testers architecture and tools, and also several people capable of quickly building and deploying the supporting web and database applications.
Repositories
Interested
  • David Golden, rjbs

Improve Parrot's smoke testing

Abstract
The Parrot project currently relies on CPAN modules Test::TAP::Model and Test::TAP::HTMLMatrix to enable developers to submit smoke reports to our smoke server. This dependency is found in Parrot::Harness::Smoke -- but the code in that module immediately overrides Test::TAP::Model::run_tests()! So the code is hack-ish. Moreover, it has been criticized on the grounds that it posts HTML directly to our smoke server. So, for several months we have been looking to develop a better smoke testing system and replace our reliance, to the extent possible, on these two modules. Any assistance by Oslo QA Hackathon participants would be appreciated.
Repository
Interested
  • Jim Keenan (not attending), Michael Peters, Gabor Szabo

I would be interested in separating this from Parrot, so that the same test running and report generation could collect and produce nice HTML reports for any TAP-producing project. I am not sure, though: isn't this exactly what Smolder is about? Szabgab 13:23, 3 April 2008 (BST)

Spec for non-perl dependencies

Abstract
META.yml only allows for the specification of Perl module dependencies. As a result, authors have all sorts of crazy ways to check for non-Perl dependencies (binaries, libraries, whatever) in their *.PL files. Neither CPAN nor CPAN Testers deals well with this (Devel::CheckLib is a nasty, brilliant hack). We really need to come up with a standard for how to represent these dependencies, and perhaps even for how to resolve them.
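As a purely hypothetical illustration of where this could go, a META.yml extension might look something like the following; none of these keys exist in any spec:

  # hypothetical, non-standard META.yml fragment
  external_requires:
    libraries:
      libxml2: '2.6.0'
    binaries:
      gnuplot: 0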
Interested
  • David Golden, AdamKennedy

TAP Meta Information

There are a number of TAP proposals related to meta information which most agree are a Good Idea. Let's hammer out the last details and do some implementing (a rough example appears after the list below).

  • Specify some keywords
  • Consider cross-platform issues
  • Consider cross-language issues
  • Consider non-traditional uses (example: TAP over http)
  • Agree on keyword extension mechanism
  • Modify TAP::Parser to handle it
  • Modify Test::Builder to produce it
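For reference, the structured (YAMLish) diagnostics proposal is usually sketched roughly like this; the exact keys are part of what needs to be agreed on:

  TAP version 13
  1..1
  not ok 1 - answer is 42
    ---
    message:  'got 41, expected 42'
    severity: fail
    got:      41
    expected: 42
    ...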
Interested
  • Michael Peters, Ovid

Test::Builder 2

Rewrite Test::Builder to support advanced testing libraries

Abstract
  • Test function begin/end actions
    • Die on fail
    • Debug on fail
  • Continuous testing (see Test::AtRuntime)
  • Pluggable outputs (TAP or XML or multiple TAP versions...)
  • Global vs local test configuration changes to allow multiple modules to customize without stepping on each other
  • Rewrite Test::Builder in terms of TB2
  • Separate global properties (plan, test counter) from local properties
  • Support for threaded and forking tests
  • Support for running tests in a sub process
  • YAML diagnostics
  • Steal ideas from Ruby, Java, Smalltalk, Perl 6, etc...
repository
Interested
  • Schwern, Ovid!!!


GUI to display TAP results

  • Smolder?
  • Web gui?
  • Gtk based?
Interested
  • Ovid, Øystein Torget (web gui)

Perl6 spec test suite

The official Perl6 test suite now lives in the pugs source code repository. A big refactoring of this test suite is in progress, but it needs more people to work on it. In the end, this spec test suite will define what Perl6 is, and whether an implementation may be called Perl6. It's a fairly simple job, and it can be an interesting way to get familiar with the various Synopses.

Interested
  • Cosimo, Øystein Torget

Revise TAP Specification

The TAP specification is incomplete and in dire need of updating.

  • Complete the spec
  • Add in accepted extensions (TAP version, TAP diagnostics)
  • Write a grammar
Interested
  • Ovid, Øystein Torget

Testing WebApps with Catalyst and Selenium

Selenium is a portable software testing framework for web applications. The tests can be written as HTML tables or coded in a number of popular programming languages, and they can be run directly in most modern web browsers. There is a Catalyst plugin for Selenium (http://search.cpan.org/~jrockway/Test-WWW-Selenium-Catalyst-0.02/lib/Test/WWW/Selenium/Catalyst.pm), but it's kind of hard to get set up and working. I would like to work on making it easier to test web apps with Selenium, and to use the open-source Catalyst-based wiki MojoMojo as a test bed for this work.
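A rough sketch of what a Selenium-driven test could look like with Test::WWW::Selenium; the host, port, URL and page title are placeholders, and a running Selenium RC server plus a running Catalyst app (e.g. MojoMojo on port 3000) are assumed:

  use strict;
  use warnings;
  use Test::More tests => 3;
  use Test::WWW::Selenium;

  my $sel = Test::WWW::Selenium->new(
      host        => 'localhost',
      port        => 4444,                     # default Selenium RC port
      browser     => '*firefox',
      browser_url => 'http://localhost:3000/', # the app under test
  );

  $sel->open_ok('/');
  $sel->wait_for_page_to_load_ok(5000);
  $sel->title_like(qr/MojoMojo/);              # placeholder title check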

Interested
  • Marcus Ramberg, Trond Michelsen (mostly interested in Selenium)

Universal TAP interpreter test suite

It's easy to write a TAP producer but difficult to write a TAP interpreter. Part of the problem is the unclear spec, but there are also a lot of edge cases to deal with. A universally applicable test suite for TAP would help.

A test can be made up of...

  • TAP input
  • How the parser should interpret it, probably in YAML.
  • Extra tests for things like exit codes.
  • Levels of acceptance.

Ideally the tests would be embedded in or refer directly to sections of the spec.
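As a hypothetical example, a single case in such a suite could pair a TAP fragment with its expected interpretation; all of the key names below are invented for illustration:

  # TAP input
  1..2
  ok 1 - first thing works
  not ok 2 - second thing # TODO not implemented yet

  # expected interpretation (hypothetical YAML keys)
  tests_planned: 2
  tests_run:     2
  passed:        [1]
  failed:        []         # test 2 is a failing TODO, so it does not fail the run
  todo:          [2]
  exit_code:     0
  acceptance:    required   # which compliance level this case belongs to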

Interested
  • Michael Peters, Øystein Torget

Solo interest (@n == 1)

Better CPANPLUS::Dist::Deb

CPANPLUS::Dist::Deb lets you easily build Debian packages from CPAN. The current version offers minimal control over the Debian part of the packaging (postinst, preinst and other control files). I would like to add support for such things, to be able to control the created packages more flexibly.

Interested
  • andremar

Community support measures

Abstract
Let's attempt a sociological look at the CPAN community. There is a lot of programming competency out there that is underutilized, untapped or unappreciated. What can we do to improve on this? Interesting questions to take this further could be:
  • How can we make it easier to find relevant modules?
  • How can we improve on the serendipity of CPAN?
  • What about module authoring - how can we convince people that the CPAN way of making Perl module distributions is a good way?
  • Where can we find new avenues for communicating QA issues and/or best practices to old and new Perl programmers?
This is mainly a brainstorming session, but hopefully we can bring good ideas to the other topics of the hackathon.
Interested

CPAN::Porters

Interested
  • Gabor Szabo

CPAN::Reporter

Abstract
Notwithstanding CPAN::Testers 1.0 or 2.0 work, CPAN::Reporter has a growing Todo backlog and RT issue list. Examples include:
  • Add timestamps to log file
  • Track prerequisites (for failures or for general statistics)
  • Better config_requires handling
  • Send first/last 25K of output instead of first 50K of output
  • Figure out how to pass through Test::Harness color formatting :-)
  • Test coverage
  • Refactor test suite
Interested
  • David Golden

CPAN Testers 1.0

Interested
  • Gabor Szabo

TAP output from XUnit of Java and .NET

Interested
  • Ovid

TAP producer for Lua

  • Finish the TAP producer for Lua I started
Interested
  • Gabor Szabo

Just ideas

Finish or Kill TAP Proposals

There's a number of TAP Proposals which have been hanging around in limbo. We should decide whether they're a good idea or not and finish anything blocking their acceptance.

perlbench

perlbench has a long history of gripes against it, mostly with respect to noise in the measurements and the resulting uncertainty about whether any given patch improves perl's performance. It would be great if participants found the time and energy to:

  • identify specific complaints regarding what perlbench does, and how it does it.
  • characterize noise by testing the same perl exe 2-6 times in a single run.
  • do a similar characterization of Benchmark.pm, mostly to provide some reference.
  • review and describe how perlbench determines baseline (no-op) costs.
  • apply 'Confidence Intervals' (IIRC) to determine how much to extend the runtime so that noise in each test can be reduced to arbitrary limits (see the sketch after this list).
  • identify other sources of benchmarks: language-shootout and spam-assassin come to mind.
  • add knowledge gained into a new perlbench-NG page.
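A small sketch of the confidence-interval idea: run the same one-liner under the same perl several times, then report the mean wall-clock time with an approximate 95% interval (the 1.96 factor assumes roughly normal noise; the one-liner is just a placeholder workload):

  use strict;
  use warnings;
  use Time::HiRes qw(time);

  my @samples;
  for my $run (1 .. 6) {
      my $t0 = time();
      system($^X, '-e', 'my $x = 0; $x += $_ for 1 .. 1_000_000');
      push @samples, time() - $t0;
  }

  my $n    = @samples;
  my $mean = 0;
  $mean   += $_ / $n for @samples;
  my $var  = 0;
  $var    += ($_ - $mean) ** 2 / ($n - 1) for @samples;
  my $ci   = 1.96 * sqrt($var / $n);    # standard error scaled to ~95%

  printf "mean %.4f s +/- %.4f s (approx. 95%% CI over %d runs)\n",
      $mean, $ci, $n;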