Name | Description | Size (bytes)
__init__.py | | 0
archive.py | Create a tar file deterministically. Receives a dict mapping names of files in the archive to local filesystem paths or ``mozpack.files.BaseFile`` instances. The files are archived and written to the passed file handle, which must be opened for writing. Only regular files can be written. FUTURE: accept a filename argument (or add APIs to write files). | 4707
attributes.py | Determine whether a given set of task attributes matches some conditions. The conditions are given as keyword arguments, where each keyword names an attribute. The keyword value can be a literal, a set, or a callable: a literal must match the attribute exactly; given a set or list, the attribute value must be contained within it; a callable is called with the attribute value and must return a boolean. If an attribute is specified as a keyword argument but is not present in the task's attributes, the result is False. Args: attributes (dict): the task's attributes object; kwargs (dict): the conditions the task's attributes must satisfy in order to match. Returns: bool: whether the task's attributes match the conditions. (Sketch below.) | 2964
cached_tasks.py | Allow the results of this task to be cached. This adds index routes to the task so it can be looked up in future runs, and optimization hints so that cached artifacts can be found. Exactly one of `digest` and `digest_data` must be passed. :param TransformConfig config: the configuration for the kind being transformed. :param dict taskdesc: the description of the current task. :param str cache_type: the type of task result being cached. :param str cache_name: the name of the object being cached. :param digest: a unique string identifying this version of the artifacts being generated, typically the hash of the task's inputs. :type digest: bytes or None. :param digest_data: a list of bytes representing the inputs of this task; they are concatenated and hashed to create the digest for the task. :type digest_data: list of bytes or None. (Sketch below.) | 4150
dependencies.py | Iterate over all dependencies as ``Task`` objects. Args: config (TransformConfig): the ``TransformConfig`` object associated with the kind; task (Dict): the task dictionary to retrieve dependencies from. Returns: Iterator[Task]: a generator that iterates over the ``Task`` objects associated with each dependency. | 2734
docker.py | Resolve an in-tree prebuilt docker image to ``<registry>/<repository>@sha256:<digest>``, or ``<registry>/<repository>:<tag>`` if `by_tag` is `True`. Args: name (str): the image to build; by_tag (bool): if True, apply a tag based on the VERSION file, otherwise apply a hash based on the HASH file. Returns: Optional[str]: the image if it can be resolved, otherwise None. | 8112
hash.py | Hash a single file. Returns the SHA-256 hash in hex form. (Sketch below.) | 1644
keyed_by.py | For values which can either be a literal value or be keyed by some attributes, perform that lookup and return the result. For example, given an item ``{'by-test-platform': {'macosx-10.11/debug': 13, 'win.*': 6, 'default': 12}}``, a call to ``evaluate_keyed_by(item, 'thing-name', {'test-platform': 'linux96'})`` would return 12. Items can be nested as deeply as desired, for example ``{'by-test-platform': {'win.*': {'by-project': {'ash': .., 'cedar': ..}}, 'linux': 13, 'default': 12}}``. Args: value (str): name of the value to perform evaluation on; item_name (str): used to generate useful error messages; attributes (dict): dictionary of attributes used to look up 'by-<key>' with; defer (list): allows evaluating a by-* entry at a later time (in the example above the project attribute may not be set yet, in which case we want to stop before resolving that subkey and call this function again later; this can be accomplished by setting defer=["project"]); enforce_single_match (bool): if True (the default), each task may only match a single arm of the evaluation. (Sketch below.) | 3348
memoize.py | | 294
parameterization.py | Resolve all instances of `{'relative-datestamp': '..'}` in the given task definition. | 3392
path.py | Like :py:mod:`os.path`, with a reduced set of functions, and with normalized path separators (always use forward slashes). Also contains a few additional utilities not found in :py:mod:`os.path`. | 4466
python_path.py | Find a Python object given a path of the form <modulepath>:<objectpath>. Conceptually equivalent to ``def find_object(modulepath, objectpath): import <modulepath> as mod; return mod.<objectpath>``. (Sketch below.) | 1576
readonlydict.py | A read-only dictionary. (Sketch below.) | 787
schema.py | Validate that an object satisfies a schema. If not, raise a useful exception whose message begins with msg_prefix. (Sketch below.) | 8260
set_name.py | | 1134
shell.py | Given a string, returns a version that can be used literally on a shell command line, enclosing it with single quotes if necessary. As a special case, if given an int, returns a string containing the int, not enclosed in quotes. (Sketch below.) | 1321
taskcluster.py | Get the current TASKCLUSTER_ROOT_URL. When running in a task, this must come from $TASKCLUSTER_ROOT_URL; when run on the command line, a default may be provided that points to the production deployment of Taskcluster. If use_proxy is set, this attempts to get TASKCLUSTER_PROXY_URL instead, failing if it is not set. | 13057
taskgraph.py | Tools for interacting with existing taskgraphs. | 1969
templates.py | Merge dicts and arrays (scalar values are overridden). Keys from source override keys from dest, and elements from lists in source are appended to lists in dest. :param dict source: to copy from. :param dict dest: to copy to (modified in place). (Sketch below.) | 2139
time.py | Convert a string to a JSON date in the future. :param str input_str: a duration (ex: 1d, 2d, 6years, 2 seconds). :returns: the given unit, in seconds. (Sketch below.) | 3350
treeherder.py | Split a symbol expressed as grp(sym) into its two parts. If no group is given, the returned group is '?'. (Sketch below.) | 2721
vcs.py | Version control system being used, either 'hg' or 'git'. | 18019
verify.py | Verification that doesn't depend on any generation state. | 8518
workertypes.py | Since the list of built-in worker-types is small and fixed, we can get away with punning the implementation name (in `taskgraph.transforms.task`) and the worker_type. | 2498
yaml.py | Parse the first YAML document in a stream and produce the corresponding Python object. | 1059
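
The sketches below illustrate several of the utilities listed above. They are simplified, illustrative examples: wherever a function name, signature, or algorithm is not stated in the listing, it is an assumption rather than the module's confirmed API.

For attributes.py, a minimal sketch of the matching rules described in the listing; the function name ``attrmatch`` is assumed::

    def attrmatch(attributes, **kwargs):
        """Return True if `attributes` satisfies every condition in kwargs (sketch)."""
        for kw, condition in kwargs.items():
            if kw not in attributes:
                # A condition on an attribute the task doesn't have never matches.
                return False
            value = attributes[kw]
            if callable(condition):
                if not condition(value):
                    return False
            elif isinstance(condition, (set, list)):
                if value not in condition:
                    return False
            elif value != condition:
                return False
        return True

    # Example: the platform must match exactly, the project must be in a set.
    print(attrmatch({"build_platform": "linux64", "project": "autoland"},
                    build_platform="linux64", project={"autoland", "try"}))  # True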
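
For cached_tasks.py, a sketch of how `digest_data` could be reduced to a digest as described (the byte strings are concatenated and hashed); the helper name and the use of SHA-256 are assumptions::

    import hashlib

    def digest_from_data(digest_data):
        """Concatenate a list of byte strings and hash them into a hex digest (sketch)."""
        hasher = hashlib.sha256()  # assumed algorithm
        for chunk in digest_data:
            hasher.update(chunk)
        return hasher.hexdigest()

    # The resulting digest identifies this version of the generated artifacts.
    print(digest_from_data([b"Dockerfile contents...", b"context hash..."]))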
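
For hash.py, a sketch of hashing a single file to its SHA-256 hex digest; the function name ``hash_path`` is assumed::

    import hashlib

    def hash_path(path):
        """Return the SHA-256 hash of the file at `path`, in hex form (sketch)."""
        hasher = hashlib.sha256()
        with open(path, "rb") as fh:
            # Read in chunks so large files don't have to fit in memory.
            for chunk in iter(lambda: fh.read(65536), b""):
                hasher.update(chunk)
        return hasher.hexdigest()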
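
For keyed_by.py, a simplified re-implementation of the ``by-<attribute>`` lookup, showing the resolution order (exact key, then keys treated as regular expressions such as ``win.*``, then ``default``). The real ``evaluate_keyed_by`` also takes ``item_name``, ``defer`` and ``enforce_single_match``, which this sketch omits::

    import re

    def resolve_keyed_by(value, attributes):
        """Resolve `by-<attr>` dictionaries against `attributes` (simplified sketch)."""
        while (isinstance(value, dict) and len(value) == 1
               and next(iter(value)).startswith("by-")):
            keyed_by = next(iter(value))              # e.g. "by-test-platform"
            alternatives = value[keyed_by]
            attr = attributes.get(keyed_by[len("by-"):])
            if attr in alternatives:                  # exact match first
                value = alternatives[attr]
                continue
            for key, alt in alternatives.items():     # then treat keys as regexes
                if key != "default" and attr is not None and re.match(key + "$", str(attr)):
                    value = alt
                    break
            else:
                value = alternatives["default"]       # KeyError here means nothing matched
        return value

    item = {"by-test-platform": {"macosx-10.11/debug": 13, "win.*": 6, "default": 12}}
    print(resolve_keyed_by(item, {"test-platform": "linux96"}))  # 12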
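
For python_path.py, a sketch of ``find_object`` built on importlib, following the ``<modulepath>:<objectpath>`` form described above; dotted object paths are walked with getattr::

    import importlib

    def find_object(path):
        """Resolve "<modulepath>:<objectpath>" to the named Python object (sketch)."""
        modulepath, objectpath = path.split(":")
        obj = importlib.import_module(modulepath)
        for name in objectpath.split("."):
            obj = getattr(obj, name)
        return obj

    print(find_object("os.path:join"))  # <function join at 0x...>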
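
For readonlydict.py, a minimal sketch of a read-only dictionary: a dict subclass whose mutating methods raise. The listing only says "a read-only dictionary", so the details here are illustrative::

    class ReadOnlyDict(dict):
        """A dict that refuses mutation after construction (sketch)."""

        def __setitem__(self, key, value):
            raise Exception("Object does not support assignment.")

        def __delitem__(self, key):
            raise Exception("Object does not support deletion.")

        def update(self, *args, **kwargs):
            raise Exception("Object does not support update.")

    d = ReadOnlyDict({"kind": "build"})
    print(d["kind"])      # "build"
    # d["kind"] = "test"  would raise an Exception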
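
For schema.py, a sketch of the described behaviour: validate an object and, on failure, raise an exception whose message begins with ``msg_prefix``. Here the schema is assumed to be a callable validator that raises on invalid input; the real module's schema machinery is not shown in the listing::

    def validate_schema(schema, obj, msg_prefix):
        """Validate `obj` against `schema`, prefixing errors with `msg_prefix` (sketch)."""
        try:
            return schema(obj)
        except Exception as exc:
            raise Exception(f"{msg_prefix}\n{exc}") from exc

    def positive_int(obj):
        """A toy validator used only to demonstrate the wrapper above."""
        if not isinstance(obj, int) or obj < 0:
            raise ValueError(f"expected a non-negative int, got {obj!r}")
        return obj

    validate_schema(positive_int, 5, "In task 'build-linux64':")  # returns 5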
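
For shell.py, a sketch of the quoting rule described above: ints come back as bare strings, obviously safe strings are left alone, and everything else is wrapped in single quotes with embedded single quotes escaped. The function name ``quote`` and the set of "safe" characters are assumptions::

    import re

    SHELL_SAFE = re.compile(r"^[a-zA-Z0-9_\-./=:+@]+$")  # assumed set of safe characters

    def quote(word):
        """Return `word` in a form usable literally on a shell command line (sketch)."""
        if isinstance(word, int):
            return str(word)      # special case: ints are returned unquoted
        if word and SHELL_SAFE.match(word):
            return word           # already safe, no quoting needed
        # Wrap in single quotes; an embedded single quote becomes '\'' .
        return "'" + word.replace("'", "'\\''") + "'"

    print(quote("hello world"))   # 'hello world'
    print(quote(42))              # 42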
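
For templates.py, a sketch of the merge behaviour described above: nested dicts are merged recursively, lists from source are appended to lists in dest, and scalar values from source override dest. The function name ``merge_to`` is assumed::

    def merge_to(source, dest):
        """Merge `source` into `dest` in place (sketch)."""
        for key, value in source.items():
            if isinstance(value, dict) and isinstance(dest.get(key), dict):
                merge_to(value, dest[key])        # recurse into nested dicts
            elif isinstance(value, list) and isinstance(dest.get(key), list):
                dest[key] = dest[key] + value     # append source list elements
            else:
                dest[key] = value                 # scalars (and new keys) override
        return dest

    dest = {"env": {"LOG": "1"}, "routes": ["index.a"]}
    merge_to({"env": {"DEBUG": "1"}, "routes": ["index.b"], "retries": 5}, dest)
    print(dest)
    # {'env': {'LOG': '1', 'DEBUG': '1'}, 'routes': ['index.a', 'index.b'], 'retries': 5}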
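
For time.py, an illustrative parser for duration strings of the form quoted in the listing (1d, 2 seconds, 6years), returning the value in seconds. The unit table, the function name, and the exact return type of the real module are assumptions::

    import re

    UNITS = {  # assumed unit table
        "s": 1, "second": 1, "seconds": 1,
        "m": 60, "minute": 60, "minutes": 60,
        "h": 3600, "hour": 3600, "hours": 3600,
        "d": 86400, "day": 86400, "days": 86400,
        "y": 31536000, "year": 31536000, "years": 31536000,
    }

    def duration_in_seconds(input_str):
        """Parse strings like "1d" or "2 seconds" into a number of seconds (sketch)."""
        match = re.match(r"\s*(\d+)\s*([a-zA-Z]+)\s*$", input_str)
        if not match:
            raise ValueError(f"not a duration: {input_str!r}")
        amount, unit = int(match.group(1)), match.group(2).lower()
        return amount * UNITS[unit]

    print(duration_in_seconds("1d"))         # 86400
    print(duration_in_seconds("2 seconds"))  # 2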
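
For treeherder.py, a sketch of splitting a Treeherder symbol of the form grp(sym) into its group and symbol, with '?' as the group when none is given; the function name ``split_symbol`` is assumed::

    def split_symbol(treeherder_symbol):
        """Split "grp(sym)" into (group, symbol); a bare symbol gets group '?' (sketch)."""
        group, symbol = "?", treeherder_symbol
        if "(" in treeherder_symbol:
            group, symbol = treeherder_symbol.split("(", 1)
            symbol = symbol.rstrip(")")
        return group, symbol

    print(split_symbol("B"))         # ('?', 'B')
    print(split_symbol("test(X1)"))  # ('test', 'X1')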