Added submodule contents into tree

This commit is contained in:
darktux 2024-04-05 01:58:27 +02:00
parent 01a752555c
commit 9b991208cd
4934 changed files with 1657477 additions and 5 deletions

externals/mbedtls/scripts/abi_check.py vendored Executable file
@@ -0,0 +1,658 @@
#!/usr/bin/env python3
"""This script compares the interfaces of two versions of Mbed TLS, looking
for backward incompatibilities between two different Git revisions within
an Mbed TLS repository. It must be run from the root of a Git working tree.
### How the script works ###
For the source (API) and runtime (ABI) interface compatibility, this script
is a small wrapper around the abi-compliance-checker and abi-dumper tools,
applying them to compare the header and library files.
For the storage format, this script compares the automatically generated
storage tests and the manual read tests, and complains if there is a
reduction in coverage. A change in test data will be signaled as a
coverage reduction since the old test data is no longer present. A change in
how test data is presented will be signaled as well; this would be a false
positive.
The results of the API/ABI comparison are either formatted as HTML and stored
at a configurable location, or are given as a brief list of problems.
Returns 0 on success, 1 on non-compliance, and 2 if there is an error
while running the script.
### How to interpret non-compliance ###
This script has relatively common false positives. In many scenarios, it only
reports a pass if there is a strict textual match between the old version and
the new version, and it reports problems where there is a sufficient semantic
match but not a textual match. This section lists some common false positives.
This is not an exhaustive list: in the end what matters is whether we are
breaking a backward compatibility goal.
**API**: the goal is that if an application works with the old version of the
library, it can be recompiled against the new version and will still work.
This is normally validated by comparing the declarations in `include/*/*.h`.
A failure is a declaration that has disappeared or that now has a different
type.
* It's ok to change or remove macros and functions that are documented as
for internal use only or as experimental.
* It's ok to rename function or macro parameters as long as the semantics
has not changed.
* It's ok to change or remove structure fields that are documented as
private.
* It's ok to add fields to a structure that already had private fields
or was documented as extensible.
**ABI**: the goal is that if an application was built against the old version
of the library, the same binary will work when linked against the new version.
This is normally validated by comparing the symbols exported by `libmbed*.so`.
A failure is a symbol that is no longer exported by the same library or that
now has a different type.
* All ABI changes are acceptable if the library version is bumped
(see `scripts/bump_version.sh`).
* ABI changes that concern functions which are declared only inside the
library directory, and not in `include/*/*.h`, are acceptable only if
the function was only ever used inside the same library (libmbedcrypto,
libmbedx509, libmbedtls). As a counter example, if the old version
of libmbedtls calls mbedtls_foo() from libmbedcrypto, and the new version
of libmbedcrypto no longer has a compatible mbedtls_foo(), this does
require a version bump for libmbedcrypto.
**Storage format**: the goal is to check that persistent keys stored by the
old version can be read by the new version. This is normally validated by
comparing the `*read*` test cases in `test_suite*storage_format*.data`.
A failure is a storage read test case that is no longer present with the same
function name and parameter list.
* It's ok if the same test data is present, but its presentation has changed,
for example if a test function is renamed or has different parameters.
* It's ok if redundant tests are removed.
**Generated test coverage**: the goal is to check that automatically
generated tests have as much coverage as before. This is normally validated
by comparing the test cases that are automatically generated by a script.
A failure is a generated test case that is no longer present with the same
function name and parameter list.
* It's ok if the same test data is present, but its presentation has changed,
for example if a test function is renamed or has different parameters.
* It's ok if redundant tests are removed.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import glob
import os
import re
import sys
import traceback
import shutil
import subprocess
import argparse
import logging
import tempfile
import fnmatch
from types import SimpleNamespace
import xml.etree.ElementTree as ET
from mbedtls_dev import build_tree
class AbiChecker:
"""API and ABI checker."""
def __init__(self, old_version, new_version, configuration):
"""Instantiate the API/ABI checker.
old_version: RepoVersion containing details to compare against
new_version: RepoVersion containing details to check
configuration.report_dir: directory for output files
configuration.keep_all_reports: if false, delete old reports
configuration.brief: if true, output shorter report to stdout
configuration.check_abi: if true, compare ABIs
configuration.check_api: if true, compare APIs
configuration.check_storage: if true, compare storage format tests
configuration.skip_file: path to file containing symbols and types to skip
"""
self.repo_path = "."
self.log = None
self.verbose = configuration.verbose
self._setup_logger()
self.report_dir = os.path.abspath(configuration.report_dir)
self.keep_all_reports = configuration.keep_all_reports
self.can_remove_report_dir = not (os.path.exists(self.report_dir) or
self.keep_all_reports)
self.old_version = old_version
self.new_version = new_version
self.skip_file = configuration.skip_file
self.check_abi = configuration.check_abi
self.check_api = configuration.check_api
if self.check_abi != self.check_api:
raise Exception('Checking API without ABI or vice versa is not supported')
self.check_storage_tests = configuration.check_storage
self.brief = configuration.brief
self.git_command = "git"
self.make_command = "make"
def _setup_logger(self):
self.log = logging.getLogger()
if self.verbose:
self.log.setLevel(logging.DEBUG)
else:
self.log.setLevel(logging.INFO)
self.log.addHandler(logging.StreamHandler())
@staticmethod
def check_abi_tools_are_installed():
for command in ["abi-dumper", "abi-compliance-checker"]:
if not shutil.which(command):
raise Exception("{} not installed, aborting".format(command))
def _get_clean_worktree_for_git_revision(self, version):
"""Make a separate worktree with version.revision checked out.
Do not modify the current worktree."""
git_worktree_path = tempfile.mkdtemp()
if version.repository:
self.log.debug(
"Checking out git worktree for revision {} from {}".format(
version.revision, version.repository
)
)
fetch_output = subprocess.check_output(
[self.git_command, "fetch",
version.repository, version.revision],
cwd=self.repo_path,
stderr=subprocess.STDOUT
)
self.log.debug(fetch_output.decode("utf-8"))
worktree_rev = "FETCH_HEAD"
else:
self.log.debug("Checking out git worktree for revision {}".format(
version.revision
))
worktree_rev = version.revision
worktree_output = subprocess.check_output(
[self.git_command, "worktree", "add", "--detach",
git_worktree_path, worktree_rev],
cwd=self.repo_path,
stderr=subprocess.STDOUT
)
self.log.debug(worktree_output.decode("utf-8"))
version.commit = subprocess.check_output(
[self.git_command, "rev-parse", "HEAD"],
cwd=git_worktree_path,
stderr=subprocess.STDOUT
).decode("ascii").rstrip()
self.log.debug("Commit is {}".format(version.commit))
return git_worktree_path
def _update_git_submodules(self, git_worktree_path, version):
"""If the crypto submodule is present, initialize it.
if version.crypto_revision exists, update it to that revision,
otherwise update it to the default revision"""
update_output = subprocess.check_output(
[self.git_command, "submodule", "update", "--init", '--recursive'],
cwd=git_worktree_path,
stderr=subprocess.STDOUT
)
self.log.debug(update_output.decode("utf-8"))
if not (os.path.exists(os.path.join(git_worktree_path, "crypto"))
and version.crypto_revision):
return
if version.crypto_repository:
fetch_output = subprocess.check_output(
[self.git_command, "fetch", version.crypto_repository,
version.crypto_revision],
cwd=os.path.join(git_worktree_path, "crypto"),
stderr=subprocess.STDOUT
)
self.log.debug(fetch_output.decode("utf-8"))
crypto_rev = "FETCH_HEAD"
else:
crypto_rev = version.crypto_revision
checkout_output = subprocess.check_output(
[self.git_command, "checkout", crypto_rev],
cwd=os.path.join(git_worktree_path, "crypto"),
stderr=subprocess.STDOUT
)
self.log.debug(checkout_output.decode("utf-8"))
def _build_shared_libraries(self, git_worktree_path, version):
"""Build the shared libraries in the specified worktree."""
my_environment = os.environ.copy()
my_environment["CFLAGS"] = "-g -Og"
my_environment["SHARED"] = "1"
if os.path.exists(os.path.join(git_worktree_path, "crypto")):
my_environment["USE_CRYPTO_SUBMODULE"] = "1"
make_output = subprocess.check_output(
[self.make_command, "lib"],
env=my_environment,
cwd=git_worktree_path,
stderr=subprocess.STDOUT
)
self.log.debug(make_output.decode("utf-8"))
for root, _dirs, files in os.walk(git_worktree_path):
for file in fnmatch.filter(files, "*.so"):
version.modules[os.path.splitext(file)[0]] = (
os.path.join(root, file)
)
@staticmethod
def _pretty_revision(version):
if version.revision == version.commit:
return version.revision
else:
return "{} ({})".format(version.revision, version.commit)
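The labelling rule above is easy to exercise on its own; a minimal standalone sketch (the function name and sample values here are local stand-ins, not part of the script):

```python
def pretty_revision(revision, commit):
    # Stand-in for AbiChecker._pretty_revision: append the resolved commit
    # id only when the user-supplied revision is not already that hash.
    if revision == commit:
        return revision
    return '{} ({})'.format(revision, commit)

assert pretty_revision('9b991208cd', '9b991208cd') == '9b991208cd'
assert pretty_revision('HEAD', '9b991208cd') == 'HEAD (9b991208cd)'
```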
def _get_abi_dumps_from_shared_libraries(self, version):
"""Generate the ABI dumps for the specified git revision.
The shared libraries must have been built and the module paths
present in version.modules."""
for mbed_module, module_path in version.modules.items():
output_path = os.path.join(
self.report_dir, "{}-{}-{}.dump".format(
mbed_module, version.revision, version.version
)
)
abi_dump_command = [
"abi-dumper",
module_path,
"-o", output_path,
"-lver", self._pretty_revision(version),
]
abi_dump_output = subprocess.check_output(
abi_dump_command,
stderr=subprocess.STDOUT
)
self.log.debug(abi_dump_output.decode("utf-8"))
version.abi_dumps[mbed_module] = output_path
@staticmethod
def _normalize_storage_test_case_data(line):
"""Eliminate cosmetic or irrelevant details in storage format test cases."""
line = re.sub(r'\s+', r'', line)
return line
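The normalization above is just whitespace removal; a small sketch showing why two cosmetically different test lines compare equal (the test data is invented for illustration):

```python
import re

def normalize(line):
    # Mirrors _normalize_storage_test_case_data: strip all whitespace so
    # that re-spaced or re-indented test data is not flagged as a change.
    return re.sub(r'\s+', '', line)

old_line = 'key_read_test:PSA_KEY_TYPE_AES: 128 : "abc"'
new_line = 'key_read_test:PSA_KEY_TYPE_AES:128:"abc"'
assert normalize(old_line) == normalize(new_line)
```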
def _read_storage_tests(self,
directory,
filename,
is_generated,
storage_tests):
"""Record storage tests from the given file.
Populate the storage_tests dictionary with test cases read from
filename under directory.
"""
at_paragraph_start = True
description = None
full_path = os.path.join(directory, filename)
with open(full_path) as fd:
for line_number, line in enumerate(fd, 1):
line = line.strip()
if not line:
at_paragraph_start = True
continue
if line.startswith('#'):
continue
if at_paragraph_start:
description = line.strip()
at_paragraph_start = False
continue
if line.startswith('depends_on:'):
continue
# We've reached a test case data line
test_case_data = self._normalize_storage_test_case_data(line)
if not is_generated:
# In manual test data, only look at read tests.
function_name = test_case_data.split(':', 1)[0]
if 'read' not in function_name.split('_'):
continue
metadata = SimpleNamespace(
filename=filename,
line_number=line_number,
description=description
)
storage_tests[test_case_data] = metadata
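The parsing loop above can be condensed into an in-memory mirror to show the paragraph structure it expects: a description line, optional `depends_on:` lines, then the test case data, with `#` comments skipped and (for manual data) only `read` tests recorded. The `.data`-style content below is invented for illustration:

```python
import re
from types import SimpleNamespace

def read_storage_tests(lines, is_generated):
    # Condensed mirror of _read_storage_tests for an in-memory list of lines.
    tests = {}
    at_paragraph_start = True
    description = None
    for line_number, line in enumerate(lines, 1):
        line = line.strip()
        if not line:
            at_paragraph_start = True
            continue
        if line.startswith('#'):
            continue
        if at_paragraph_start:
            description = line
            at_paragraph_start = False
            continue
        if line.startswith('depends_on:'):
            continue
        test_case_data = re.sub(r'\s+', '', line)
        if not is_generated:
            function_name = test_case_data.split(':', 1)[0]
            if 'read' not in function_name.split('_'):
                continue
        tests[test_case_data] = SimpleNamespace(
            line_number=line_number, description=description)
    return tests

content = [
    'Storage format: AES key',
    'depends_on:PSA_WANT_KEY_TYPE_AES',
    'key_read_test:PSA_KEY_TYPE_AES:"0123"',
    '',
    'Storage format: AES key (write)',
    'key_write_test:PSA_KEY_TYPE_AES:"0123"',
]
tests = read_storage_tests(content, is_generated=False)
assert list(tests) == ['key_read_test:PSA_KEY_TYPE_AES:"0123"']
```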
@staticmethod
def _list_generated_test_data_files(git_worktree_path):
"""List the generated test data files."""
output = subprocess.check_output(
['tests/scripts/generate_psa_tests.py', '--list'],
cwd=git_worktree_path,
).decode('ascii')
return [line for line in output.split('\n') if line]
def _get_storage_format_tests(self, version, git_worktree_path):
"""Record the storage format tests for the specified git version.
The storage format tests are the test suite data files whose name
contains "storage_format".
The version must be checked out at git_worktree_path.
This function creates or updates the generated data files.
"""
# Existing test data files. This may be missing some automatically
# generated files if they haven't been generated yet.
storage_data_files = set(glob.glob(
'tests/suites/test_suite_*storage_format*.data'
))
# Discover and (re)generate automatically generated data files.
to_be_generated = set()
for filename in self._list_generated_test_data_files(git_worktree_path):
if 'storage_format' in filename:
storage_data_files.add(filename)
to_be_generated.add(filename)
subprocess.check_call(
['tests/scripts/generate_psa_tests.py'] + sorted(to_be_generated),
cwd=git_worktree_path,
)
for test_file in sorted(storage_data_files):
self._read_storage_tests(git_worktree_path,
test_file,
test_file in to_be_generated,
version.storage_tests)
def _cleanup_worktree(self, git_worktree_path):
"""Remove the specified git worktree."""
shutil.rmtree(git_worktree_path)
worktree_output = subprocess.check_output(
[self.git_command, "worktree", "prune"],
cwd=self.repo_path,
stderr=subprocess.STDOUT
)
self.log.debug(worktree_output.decode("utf-8"))
def _get_abi_dump_for_ref(self, version):
"""Generate the interface information for the specified git revision."""
git_worktree_path = self._get_clean_worktree_for_git_revision(version)
self._update_git_submodules(git_worktree_path, version)
if self.check_abi:
self._build_shared_libraries(git_worktree_path, version)
self._get_abi_dumps_from_shared_libraries(version)
if self.check_storage_tests:
self._get_storage_format_tests(version, git_worktree_path)
self._cleanup_worktree(git_worktree_path)
    def _remove_children_with_tag(self, parent, tag):
        # Element.getchildren() was removed in Python 3.9; iterate via list().
        children = list(parent)
        for child in children:
            if child.tag == tag:
                parent.remove(child)
            else:
                self._remove_children_with_tag(child, tag)

    def _remove_extra_detail_from_report(self, report_root):
        for tag in ['test_info', 'test_results', 'problem_summary',
                    'added_symbols', 'affected']:
            self._remove_children_with_tag(report_root, tag)
        for report in report_root:
            for problems in list(report):
                if not len(problems):
                    report.remove(problems)
def _abi_compliance_command(self, mbed_module, output_path):
"""Build the command to run to analyze the library mbed_module.
The report will be placed in output_path."""
abi_compliance_command = [
"abi-compliance-checker",
"-l", mbed_module,
"-old", self.old_version.abi_dumps[mbed_module],
"-new", self.new_version.abi_dumps[mbed_module],
"-strict",
"-report-path", output_path,
]
if self.skip_file:
abi_compliance_command += ["-skip-symbols", self.skip_file,
"-skip-types", self.skip_file]
if self.brief:
abi_compliance_command += ["-report-format", "xml",
"-stdout"]
return abi_compliance_command
def _is_library_compatible(self, mbed_module, compatibility_report):
"""Test if the library mbed_module has remained compatible.
Append a message regarding compatibility to compatibility_report."""
output_path = os.path.join(
self.report_dir, "{}-{}-{}.html".format(
mbed_module, self.old_version.revision,
self.new_version.revision
)
)
try:
subprocess.check_output(
self._abi_compliance_command(mbed_module, output_path),
stderr=subprocess.STDOUT
)
except subprocess.CalledProcessError as err:
if err.returncode != 1:
raise err
if self.brief:
self.log.info(
"Compatibility issues found for {}".format(mbed_module)
)
report_root = ET.fromstring(err.output.decode("utf-8"))
self._remove_extra_detail_from_report(report_root)
self.log.info(ET.tostring(report_root).decode("utf-8"))
else:
self.can_remove_report_dir = False
compatibility_report.append(
"Compatibility issues found for {}, "
"for details see {}".format(mbed_module, output_path)
)
return False
compatibility_report.append(
"No compatibility issues for {}".format(mbed_module)
)
if not (self.keep_all_reports or self.brief):
os.remove(output_path)
return True
@staticmethod
def _is_storage_format_compatible(old_tests, new_tests,
compatibility_report):
"""Check whether all tests present in old_tests are also in new_tests.
Append a message regarding compatibility to compatibility_report.
"""
missing = frozenset(old_tests.keys()).difference(new_tests.keys())
for test_data in sorted(missing):
metadata = old_tests[test_data]
compatibility_report.append(
'Test case from {} line {} "{}" has disappeared: {}'.format(
metadata.filename, metadata.line_number,
metadata.description, test_data
)
)
compatibility_report.append(
'FAIL: {}/{} storage format test cases have changed or disappeared.'.format(
len(missing), len(old_tests)
) if missing else
'PASS: All {} storage format test cases are preserved.'.format(
len(old_tests)
)
)
compatibility_report.append(
            'Info: number of storage format test cases: {} -> {}.'.format(
len(old_tests), len(new_tests)
)
)
return not missing
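The compatibility check above reduces to a set difference over normalized test-case keys; a minimal standalone sketch with hypothetical test data:

```python
# Hypothetical normalized test-case keys, mapped to where they came from.
old_tests = {'read_aes:128:"abc"': 'suite.data line 10',
             'read_rsa:2048:"def"': 'suite.data line 14'}
new_tests = {'read_aes:128:"abc"': 'suite.data line 10'}

# Mirrors _is_storage_format_compatible: the check passes only if every
# old test case key is still present in the new set.
missing = frozenset(old_tests).difference(new_tests)
compatible = not missing

assert sorted(missing) == ['read_rsa:2048:"def"']
assert compatible is False
```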
def get_abi_compatibility_report(self):
"""Generate a report of the differences between the reference ABI
and the new ABI. ABI dumps from self.old_version and self.new_version
must be available."""
compatibility_report = ["Checking evolution from {} to {}".format(
self._pretty_revision(self.old_version),
self._pretty_revision(self.new_version)
)]
compliance_return_code = 0
if self.check_abi:
shared_modules = list(set(self.old_version.modules.keys()) &
set(self.new_version.modules.keys()))
for mbed_module in shared_modules:
if not self._is_library_compatible(mbed_module,
compatibility_report):
compliance_return_code = 1
if self.check_storage_tests:
if not self._is_storage_format_compatible(
self.old_version.storage_tests,
self.new_version.storage_tests,
compatibility_report):
compliance_return_code = 1
for version in [self.old_version, self.new_version]:
for mbed_module, mbed_module_dump in version.abi_dumps.items():
os.remove(mbed_module_dump)
if self.can_remove_report_dir:
os.rmdir(self.report_dir)
self.log.info("\n".join(compatibility_report))
return compliance_return_code
def check_for_abi_changes(self):
"""Generate a report of ABI differences
between self.old_rev and self.new_rev."""
build_tree.check_repo_path()
if self.check_api or self.check_abi:
self.check_abi_tools_are_installed()
self._get_abi_dump_for_ref(self.old_version)
self._get_abi_dump_for_ref(self.new_version)
return self.get_abi_compatibility_report()
def run_main():
try:
parser = argparse.ArgumentParser(
description=__doc__
)
parser.add_argument(
"-v", "--verbose", action="store_true",
help="set verbosity level",
)
parser.add_argument(
"-r", "--report-dir", type=str, default="reports",
help="directory where reports are stored, default is reports",
)
parser.add_argument(
"-k", "--keep-all-reports", action="store_true",
help="keep all reports, even if there are no compatibility issues",
)
parser.add_argument(
"-o", "--old-rev", type=str, help="revision for old version.",
required=True,
)
parser.add_argument(
"-or", "--old-repo", type=str, help="repository for old version."
)
parser.add_argument(
"-oc", "--old-crypto-rev", type=str,
help="revision for old crypto submodule."
)
parser.add_argument(
"-ocr", "--old-crypto-repo", type=str,
help="repository for old crypto submodule."
)
parser.add_argument(
"-n", "--new-rev", type=str, help="revision for new version",
required=True,
)
parser.add_argument(
"-nr", "--new-repo", type=str, help="repository for new version."
)
parser.add_argument(
"-nc", "--new-crypto-rev", type=str,
help="revision for new crypto version"
)
parser.add_argument(
"-ncr", "--new-crypto-repo", type=str,
help="repository for new crypto submodule."
)
parser.add_argument(
"-s", "--skip-file", type=str,
help=("path to file containing symbols and types to skip "
"(typically \"-s identifiers\" after running "
"\"tests/scripts/list-identifiers.sh --internal\")")
)
parser.add_argument(
"--check-abi",
action='store_true', default=True,
help="Perform ABI comparison (default: yes)"
)
parser.add_argument("--no-check-abi", action='store_false', dest='check_abi')
parser.add_argument(
"--check-api",
action='store_true', default=True,
help="Perform API comparison (default: yes)"
)
parser.add_argument("--no-check-api", action='store_false', dest='check_api')
parser.add_argument(
"--check-storage",
action='store_true', default=True,
help="Perform storage tests comparison (default: yes)"
)
parser.add_argument("--no-check-storage", action='store_false', dest='check_storage')
parser.add_argument(
"-b", "--brief", action="store_true",
help="output only the list of issues to stdout, instead of a full report",
)
abi_args = parser.parse_args()
if os.path.isfile(abi_args.report_dir):
print("Error: {} is not a directory".format(abi_args.report_dir))
parser.exit()
old_version = SimpleNamespace(
version="old",
repository=abi_args.old_repo,
revision=abi_args.old_rev,
commit=None,
crypto_repository=abi_args.old_crypto_repo,
crypto_revision=abi_args.old_crypto_rev,
abi_dumps={},
storage_tests={},
modules={}
)
new_version = SimpleNamespace(
version="new",
repository=abi_args.new_repo,
revision=abi_args.new_rev,
commit=None,
crypto_repository=abi_args.new_crypto_repo,
crypto_revision=abi_args.new_crypto_rev,
abi_dumps={},
storage_tests={},
modules={}
)
configuration = SimpleNamespace(
verbose=abi_args.verbose,
report_dir=abi_args.report_dir,
keep_all_reports=abi_args.keep_all_reports,
brief=abi_args.brief,
check_abi=abi_args.check_abi,
check_api=abi_args.check_api,
check_storage=abi_args.check_storage,
skip_file=abi_args.skip_file
)
abi_check = AbiChecker(old_version, new_version, configuration)
return_code = abi_check.check_for_abi_changes()
sys.exit(return_code)
except Exception: # pylint: disable=broad-except
# Print the backtrace and exit explicitly so as to exit with
# status 2, not 1.
traceback.print_exc()
sys.exit(2)
if __name__ == "__main__":
run_main()

externals/mbedtls/scripts/apidoc_full.sh vendored Executable file
@@ -0,0 +1,28 @@
#!/bin/sh
# Generate doxygen documentation with a full mbedtls_config.h (this ensures that every
# available flag is documented, and avoids warnings about documentation
# without a corresponding #define).
#
# /!\ This must not be a Makefile target, as it would create a race condition
# when multiple targets are invoked in the same parallel build.
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
set -eu
CONFIG_H='include/mbedtls/mbedtls_config.h'
if [ -r $CONFIG_H ]; then :; else
echo "$CONFIG_H not found" >&2
exit 1
fi
CONFIG_BAK=${CONFIG_H}.bak
cp -p $CONFIG_H $CONFIG_BAK
scripts/config.py realfull
make apidoc
mv $CONFIG_BAK $CONFIG_H

@@ -0,0 +1,534 @@
#!/usr/bin/env python3
"""Assemble Mbed TLS change log entries into the change log file.
Add changelog entries to the first level-2 section.
Create a new level-2 section for unreleased changes if needed.
Remove the input files unless --keep-entries is specified.
In each level-3 section, entries are sorted in chronological order
(oldest first). From oldest to newest:
* Merged entry files are sorted according to their merge date (date of
the merge commit that brought the commit that created the file into
the target branch).
* Committed but unmerged entry files are sorted according to the date
of the commit that adds them.
* Uncommitted entry files are sorted according to their modification time.
You must run this program from within a git working directory.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import argparse
from collections import OrderedDict, namedtuple
import datetime
import functools
import glob
import os
import re
import subprocess
import sys
class InputFormatError(Exception):
def __init__(self, filename, line_number, message, *args, **kwargs):
message = '{}:{}: {}'.format(filename, line_number,
message.format(*args, **kwargs))
super().__init__(message)
class CategoryParseError(Exception):
def __init__(self, line_offset, error_message):
self.line_offset = line_offset
self.error_message = error_message
super().__init__('{}: {}'.format(line_offset, error_message))
class LostContent(Exception):
def __init__(self, filename, line):
message = ('Lost content from {}: "{}"'.format(filename, line))
super().__init__(message)
class FilePathError(Exception):
def __init__(self, filenames):
message = ('Changelog filenames do not end with .txt: {}'.format(", ".join(filenames)))
super().__init__(message)
# The category names we use in the changelog.
# If you edit this, update ChangeLog.d/README.md.
STANDARD_CATEGORIES = (
'API changes',
'Default behavior changes',
'Requirement changes',
'New deprecations',
'Removals',
'Features',
'Security',
'Bugfix',
'Changes',
)
# The maximum line length for an entry
MAX_LINE_LENGTH = 80
CategoryContent = namedtuple('CategoryContent', [
'name', 'title_line', # Title text and line number of the title
'body', 'body_line', # Body text and starting line number of the body
])
class ChangelogFormat:
"""Virtual class documenting how to write a changelog format class."""
@classmethod
def extract_top_version(cls, changelog_file_content):
"""Split out the top version section.
If the top version is already released, create a new top
version section for an unreleased version.
Return ``(header, top_version_title, top_version_body, trailer)``
where the "top version" is the existing top version section if it's
for unreleased changes, and a newly created section otherwise.
To assemble the changelog after modifying top_version_body,
concatenate the four pieces.
"""
raise NotImplementedError
@classmethod
def version_title_text(cls, version_title):
"""Return the text of a formatted version section title."""
raise NotImplementedError
@classmethod
def split_categories(cls, version_body):
"""Split a changelog version section body into categories.
Return a list of `CategoryContent` the name is category title
without any formatting.
"""
raise NotImplementedError
@classmethod
def format_category(cls, title, body):
"""Construct the text of a category section from its title and body."""
raise NotImplementedError
class TextChangelogFormat(ChangelogFormat):
"""The traditional Mbed TLS changelog format."""
_unreleased_version_text = '= {} x.x.x branch released xxxx-xx-xx'
@classmethod
def is_released_version(cls, title):
# Look for an incomplete release date
return not re.search(r'[0-9x]{4}-[0-9x]{2}-[0-9x]?x', title)
_top_version_re = re.compile(r'(?:\A|\n)(=[^\n]*\n+)(.*?\n)(?:=|$)',
re.DOTALL)
_name_re = re.compile(r'=\s(.*)\s[0-9x]+\.', re.DOTALL)
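The release-date heuristic above can be checked on its own: a title whose date still contains `x` placeholders marks an unreleased section. A standalone sketch (function name and sample titles are illustrative stand-ins):

```python
import re

def is_released_version(title):
    # Stand-in for TextChangelogFormat.is_released_version: look for an
    # incomplete release date such as xxxx-xx-xx or 2021-12-xx.
    return not re.search(r'[0-9x]{4}-[0-9x]{2}-[0-9x]?x', title)

assert is_released_version('= Mbed TLS 3.1.0 branch released 2021-12-17')
assert not is_released_version('= Mbed TLS x.x.x branch released xxxx-xx-xx')
```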
@classmethod
def extract_top_version(cls, changelog_file_content):
"""A version section starts with a line starting with '='."""
m = re.search(cls._top_version_re, changelog_file_content)
top_version_start = m.start(1)
top_version_end = m.end(2)
top_version_title = m.group(1)
top_version_body = m.group(2)
name = re.match(cls._name_re, top_version_title).group(1)
if cls.is_released_version(top_version_title):
top_version_end = top_version_start
top_version_title = cls._unreleased_version_text.format(name) + '\n\n'
top_version_body = ''
return (changelog_file_content[:top_version_start],
top_version_title, top_version_body,
changelog_file_content[top_version_end:])
@classmethod
def version_title_text(cls, version_title):
        return re.sub(r'\n.*', '', version_title, flags=re.DOTALL)
_category_title_re = re.compile(r'(^\w.*)\n+', re.MULTILINE)
@classmethod
def split_categories(cls, version_body):
"""A category title is a line with the title in column 0."""
if not version_body:
return []
title_matches = list(re.finditer(cls._category_title_re, version_body))
if not title_matches or title_matches[0].start() != 0:
# There is junk before the first category.
raise CategoryParseError(0, 'Junk found where category expected')
title_starts = [m.start(1) for m in title_matches]
body_starts = [m.end(0) for m in title_matches]
body_ends = title_starts[1:] + [len(version_body)]
bodies = [version_body[body_start:body_end].rstrip('\n') + '\n'
for (body_start, body_end) in zip(body_starts, body_ends)]
title_lines = [version_body[:pos].count('\n') for pos in title_starts]
body_lines = [version_body[:pos].count('\n') for pos in body_starts]
return [CategoryContent(title_match.group(1), title_line,
body, body_line)
for title_match, title_line, body, body_line
in zip(title_matches, title_lines, bodies, body_lines)]
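The category-title regex used above matches any line whose first column is a word character; a short sketch against an invented version body:

```python
import re

# Stand-in for TextChangelogFormat._category_title_re.
category_title_re = re.compile(r'(^\w.*)\n+', re.MULTILINE)

# Entry lines are indented, so only the two titles match.
version_body = 'Bugfix\n   * Fix X.\n\nFeatures\n   * Add Y.\n'
titles = [m.group(1) for m in category_title_re.finditer(version_body)]
assert titles == ['Bugfix', 'Features']
```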
@classmethod
def format_category(cls, title, body):
# `split_categories` ensures that each body ends with a newline.
# Make sure that there is additionally a blank line between categories.
if not body.endswith('\n\n'):
body += '\n'
return title + '\n' + body
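The blank-line guarantee above can be sketched standalone (the sample body is invented):

```python
def format_category(title, body):
    # Mirrors TextChangelogFormat.format_category: ensure a blank line
    # separates this category from the next one.
    if not body.endswith('\n\n'):
        body += '\n'
    return title + '\n' + body

assert format_category('Bugfix', '   * Fix X.\n') == 'Bugfix\n   * Fix X.\n\n'
```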
class ChangeLog:
"""An Mbed TLS changelog.
A changelog file consists of some header text followed by one or
more version sections. The version sections are in reverse
chronological order. Each version section consists of a title and a body.
The body of a version section consists of zero or more category
subsections. Each category subsection consists of a title and a body.
A changelog entry file has the same format as the body of a version section.
A `ChangelogFormat` object defines the concrete syntax of the changelog.
Entry files must have the same format as the changelog file.
"""
# Only accept dotted version numbers (e.g. "3.1", not "3").
# Refuse ".x" in a version number where x is a letter: this indicates
# a version that is not yet released. Something like "3.1a" is accepted.
_version_number_re = re.compile(r'[0-9]+\.[0-9A-Za-z.]+')
_incomplete_version_number_re = re.compile(r'.*\.[A-Za-z]')
_only_url_re = re.compile(r'^\s*\w+://\S+\s*$')
_has_url_re = re.compile(r'.*://.*')
def add_categories_from_text(self, filename, line_offset,
text, allow_unknown_category):
"""Parse a version section or entry file."""
try:
categories = self.format.split_categories(text)
except CategoryParseError as e:
raise InputFormatError(filename, line_offset + e.line_offset,
e.error_message)
for category in categories:
if not allow_unknown_category and \
category.name not in self.categories:
raise InputFormatError(filename,
line_offset + category.title_line,
'Unknown category: "{}"',
category.name)
body_split = category.body.splitlines()
for line_number, line in enumerate(body_split, 1):
if not self._only_url_re.match(line) and \
len(line) > MAX_LINE_LENGTH:
long_url_msg = '. URL exceeding length limit must be alone in its line.' \
if self._has_url_re.match(line) else ""
raise InputFormatError(filename,
category.body_line + line_number,
'Line is longer than allowed: '
'Length {} (Max {}){}',
len(line), MAX_LINE_LENGTH,
long_url_msg)
self.categories[category.name] += category.body
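The line-length rule enforced above (overlong lines are rejected unless the whole line is a single URL) can be exercised with a standalone predicate; names below are local stand-ins for the class attributes:

```python
import re

MAX_LINE_LENGTH = 80
only_url_re = re.compile(r'^\s*\w+://\S+\s*$')

def line_ok(line):
    # Mirrors the check in add_categories_from_text: a line may exceed the
    # limit only if it consists of nothing but one URL.
    return len(line) <= MAX_LINE_LENGTH or bool(only_url_re.match(line))

assert line_ok('Short entry line.')
assert line_ok('https://example.com/' + 'a' * 100)
assert not line_ok('x' * 100)
```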
def __init__(self, input_stream, changelog_format):
"""Create a changelog object.
Populate the changelog object from the content of the file
input_stream.
"""
self.format = changelog_format
whole_file = input_stream.read()
(self.header,
self.top_version_title, top_version_body,
self.trailer) = self.format.extract_top_version(whole_file)
# Split the top version section into categories.
self.categories = OrderedDict()
for category in STANDARD_CATEGORIES:
self.categories[category] = ''
offset = (self.header + self.top_version_title).count('\n') + 1
self.add_categories_from_text(input_stream.name, offset,
top_version_body, True)
def add_file(self, input_stream):
"""Add changelog entries from a file.
"""
self.add_categories_from_text(input_stream.name, 1,
input_stream.read(), False)
def write(self, filename):
"""Write the changelog to the specified file.
"""
with open(filename, 'w', encoding='utf-8') as out:
out.write(self.header)
out.write(self.top_version_title)
for title, body in self.categories.items():
if not body:
continue
out.write(self.format.format_category(title, body))
out.write(self.trailer)
@functools.total_ordering
class EntryFileSortKey:
"""This classes defines an ordering on changelog entry files: older < newer.
* Merged entry files are sorted according to their merge date (date of
the merge commit that brought the commit that created the file into
the target branch).
* Committed but unmerged entry files are sorted according to the date
of the commit that adds them.
* Uncommitted entry files are sorted according to their modification time.
This class assumes that the file is in a git working directory with
the target branch checked out.
"""
# Categories of files. A lower number is considered older.
MERGED = 0
COMMITTED = 1
LOCAL = 2
@staticmethod
def creation_hash(filename):
"""Return the git commit id at which the given file was created.
Return None if the file was never checked into git.
"""
hashes = subprocess.check_output(['git', 'log', '--format=%H',
'--follow',
'--', filename])
m = re.search('(.+)$', hashes.decode('ascii'))
if not m:
# The git output is empty. This means that the file was
# never checked in.
return None
# The last commit in the log is the oldest one, which is when the
# file was created.
return m.group(0)
@staticmethod
def list_merges(some_hash, target, *options):
"""List merge commits from some_hash to target.
Pass options to git to select which commits are included.
"""
text = subprocess.check_output(['git', 'rev-list',
'--merges', *options,
'..'.join([some_hash, target])])
return text.decode('ascii').rstrip('\n').split('\n')
@classmethod
def merge_hash(cls, some_hash):
"""Return the git commit id at which the given commit was merged.
Return None if the given commit was never merged.
"""
target = 'HEAD'
# List the merges from some_hash to the target in two ways.
# The ancestry list is the ones that are both descendants of
# some_hash and ancestors of the target.
ancestry = frozenset(cls.list_merges(some_hash, target,
'--ancestry-path'))
# The first_parents list only contains merges that are directly
# on the target branch. We want it in reverse order (oldest first).
first_parents = cls.list_merges(some_hash, target,
'--first-parent', '--reverse')
# Look for the oldest merge commit that's both on the direct path
# and directly on the target branch. That's the place where some_hash
# was merged on the target branch. See
# https://stackoverflow.com/questions/8475448/find-merge-commit-which-include-a-specific-commit
for commit in first_parents:
if commit in ancestry:
return commit
return None
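The intersection logic above can be sketched offline with hypothetical commit ids: the merge point is the oldest first-parent merge that also lies on the ancestry path.

```python
# Hypothetical commit ids, not real hashes -- illustration only.
ancestry = frozenset(['m3', 'm2'])       # merges on the ancestry path to HEAD
first_parents = ['m1', 'm2', 'm3']       # merges directly on the branch, oldest first
merge_point = next((c for c in first_parents if c in ancestry), None)
assert merge_point == 'm2'
```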
@staticmethod
def commit_timestamp(commit_id):
"""Return the timestamp of the given commit."""
text = subprocess.check_output(['git', 'show', '-s',
'--format=%ct',
commit_id])
return datetime.datetime.utcfromtimestamp(int(text))
@staticmethod
def file_timestamp(filename):
"""Return the modification timestamp of the given file."""
mtime = os.stat(filename).st_mtime
return datetime.datetime.fromtimestamp(mtime)
def __init__(self, filename):
"""Determine position of the file in the changelog entry order.
This constructor returns an object that can be used with comparison
operators, with `sort` and `sorted`, etc. Older entries are sorted
before newer entries.
"""
self.filename = filename
creation_hash = self.creation_hash(filename)
if not creation_hash:
self.category = self.LOCAL
self.datetime = self.file_timestamp(filename)
return
merge_hash = self.merge_hash(creation_hash)
if not merge_hash:
self.category = self.COMMITTED
self.datetime = self.commit_timestamp(creation_hash)
return
self.category = self.MERGED
self.datetime = self.commit_timestamp(merge_hash)
def sort_key(self):
""""Return a concrete sort key for this entry file sort key object.
``ts1 < ts2`` is implemented as ``ts1.sort_key() < ts2.sort_key()``.
"""
return (self.category, self.datetime, self.filename)
def __eq__(self, other):
return self.sort_key() == other.sort_key()
def __lt__(self, other):
return self.sort_key() < other.sort_key()
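Because MERGED < COMMITTED < LOCAL, the three-way ordering falls out of plain tuple comparison; a minimal sketch with made-up timestamps:

```python
import datetime

# (category, datetime, filename) tuples sort MERGED < COMMITTED < LOCAL,
# then oldest first within a category -- made-up data for illustration.
d = datetime.datetime
keys = [
    (2, d(2020, 1, 5), 'local.txt'),      # LOCAL
    (0, d(2020, 1, 9), 'merged.txt'),     # MERGED
    (1, d(2020, 1, 1), 'committed.txt'),  # COMMITTED
]
ordered = [name for _, _, name in sorted(keys)]
assert ordered == ['merged.txt', 'committed.txt', 'local.txt']
```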
def check_output(generated_output_file, main_input_file, merged_files):
"""Make sanity checks on the generated output.
The intent of these sanity checks is to have reasonable confidence
that no content has been lost.
The sanity check is that every line that is present in an input file
is also present in an output file. This is not perfect but good enough
for now.
"""
with open(generated_output_file, 'r', encoding='utf-8') as fd:
generated_output = set(fd)
for line in open(main_input_file, 'r', encoding='utf-8'):
if line not in generated_output:
raise LostContent('original file', line)
for merged_file in merged_files:
for line in open(merged_file, 'r', encoding='utf-8'):
if line not in generated_output:
raise LostContent(merged_file, line)
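The line-level containment check above amounts to set membership; a toy sketch with invented changelog lines:

```python
# Sketch of the sanity check: every merged input line must appear verbatim
# in the generated output (example lines are invented).
generated_output = {'Bugfix: fix X\n', 'Feature: add Y\n'}
merged_lines = ['Bugfix: fix X\n']
lost = [line for line in merged_lines if line not in generated_output]
assert lost == []
```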
def finish_output(changelog, output_file, input_file, merged_files):
"""Write the changelog to the output file.
The input file and the list of merged files are used only for sanity
checks on the output.
"""
if os.path.exists(output_file) and not os.path.isfile(output_file):
# The output is a non-regular file (e.g. pipe). Write to it directly.
output_temp = output_file
else:
# The output is a regular file. Write to a temporary file,
# then move it into place atomically.
output_temp = output_file + '.tmp'
changelog.write(output_temp)
check_output(output_temp, input_file, merged_files)
if output_temp != output_file:
os.rename(output_temp, output_file)
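The write-to-temp-then-rename step can be sketched in isolation; `os.rename` within the same filesystem is atomic on POSIX, so readers never see a half-written changelog (paths below are illustrative):

```python
import os
import tempfile

# Write to a temporary file, then move it into place atomically.
path = os.path.join(tempfile.mkdtemp(), 'ChangeLog')
tmp = path + '.tmp'
with open(tmp, 'w', encoding='utf-8') as out:
    out.write('= Mbed TLS x.y.z released ...\n')
os.rename(tmp, path)  # atomic replace on POSIX filesystems
with open(path, encoding='utf-8') as f:
    content = f.read()
assert content.startswith('= Mbed TLS')
assert not os.path.exists(tmp)
```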
def remove_merged_entries(files_to_remove):
for filename in files_to_remove:
os.remove(filename)
def list_files_to_merge(options):
"""List the entry files to merge, oldest first.
"Oldest" is defined by `EntryFileSortKey`.
Also check that each file has the required .txt extension.
"""
files_to_merge = glob.glob(os.path.join(options.dir, '*'))
# Ignore 00README.md
readme = os.path.join(options.dir, "00README.md")
if readme in files_to_merge:
files_to_merge.remove(readme)
# Identify files without the required .txt extension
bad_files = [x for x in files_to_merge if not x.endswith(".txt")]
if bad_files:
raise FilePathError(bad_files)
files_to_merge.sort(key=EntryFileSortKey)
return files_to_merge
def merge_entries(options):
"""Merge changelog entries into the changelog file.
Read the changelog file from options.input.
Check that all entries have a .txt extension.
Read entries to merge from the directory options.dir.
Write the new changelog to options.output.
Remove the merged entries if options.keep_entries is false.
"""
with open(options.input, 'r', encoding='utf-8') as input_file:
changelog = ChangeLog(input_file, TextChangelogFormat)
files_to_merge = list_files_to_merge(options)
if not files_to_merge:
sys.stderr.write('There are no pending changelog entries.\n')
return
for filename in files_to_merge:
with open(filename, 'r', encoding='utf-8') as input_file:
changelog.add_file(input_file)
finish_output(changelog, options.output, options.input, files_to_merge)
if not options.keep_entries:
remove_merged_entries(files_to_merge)
def show_file_timestamps(options):
"""List the files to merge and their timestamp.
This is only intended for debugging purposes.
"""
files = list_files_to_merge(options)
for filename in files:
ts = EntryFileSortKey(filename)
print(ts.category, ts.datetime, filename)
def set_defaults(options):
"""Add default values for missing options."""
output_file = getattr(options, 'output', None)
if output_file is None:
options.output = options.input
if getattr(options, 'keep_entries', None) is None:
options.keep_entries = (output_file is not None)
def main():
"""Command line entry point."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('--dir', '-d', metavar='DIR',
default='ChangeLog.d',
help='Directory to read entries from'
' (default: ChangeLog.d)')
parser.add_argument('--input', '-i', metavar='FILE',
default='ChangeLog',
help='Existing changelog file to read from and augment'
' (default: ChangeLog)')
parser.add_argument('--keep-entries',
action='store_true', dest='keep_entries', default=None,
help='Keep the files containing entries'
' (default: remove them if --output/-o is not specified)')
parser.add_argument('--no-keep-entries',
action='store_false', dest='keep_entries',
help='Remove the files containing entries after they are merged'
' (default: remove them if --output/-o is not specified)')
parser.add_argument('--output', '-o', metavar='FILE',
help='Output changelog file'
' (default: overwrite the input)')
parser.add_argument('--list-files-only',
action='store_true',
help=('Only list the files that would be processed '
'(with some debugging information)'))
options = parser.parse_args()
set_defaults(options)
if options.list_files_only:
show_file_timestamps(options)
return
merge_entries(options)
if __name__ == '__main__':
main()


@ -0,0 +1,5 @@
# Python modules required to build Mbed TLS in ordinary conditions.
# Required to (re-)generate source files. Not needed if the generated source
# files are already present and up-to-date.
-r driver.requirements.txt

148
externals/mbedtls/scripts/bump_version.sh vendored Executable file

@ -0,0 +1,148 @@
#!/bin/bash
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
# Purpose
#
# Sets the version numbers in the source code to those given.
#
# Usage: bump_version.sh [ --version <version> ] [ --so-crypto <version>]
# [ --so-x509 <version> ] [ --so-tls <version> ]
# [ -v | --verbose ] [ -h | --help ]
#
set -e
VERSION=""
SOVERSION=""
# Parse arguments
#
until [ -z "$1" ]
do
case "$1" in
--version)
# Version to use
shift
VERSION=$1
;;
--so-crypto)
shift
SO_CRYPTO=$1
;;
--so-x509)
shift
SO_X509=$1
;;
--so-tls)
shift
SO_TLS=$1
;;
-v|--verbose)
# Be verbose
VERBOSE="1"
;;
-h|--help)
# print help
echo "Usage: $0"
echo -e " -h|--help\t\tPrint this help."
echo -e " --version <version>\tVersion to bump to."
echo -e " --so-crypto <version>\tSO version to bump libmbedcrypto to."
echo -e " --so-x509 <version>\tSO version to bump libmbedx509 to."
echo -e " --so-tls <version>\tSO version to bump libmbedtls to."
echo -e " -v|--verbose\t\tVerbose."
exit 1
;;
*)
# print error
echo "Unknown argument: '$1'"
exit 1
;;
esac
shift
done
if [ "X" = "X$VERSION" ];
then
echo "No version specified. Unable to continue."
exit 1
fi
[ $VERBOSE ] && echo "Bumping VERSION in CMakeLists.txt"
sed -e "s/ VERSION [0-9.]\{1,\}/ VERSION $VERSION/g" < CMakeLists.txt > tmp
mv tmp CMakeLists.txt
[ $VERBOSE ] && echo "Bumping VERSION in library/CMakeLists.txt"
sed -e "s/ VERSION [0-9.]\{1,\}/ VERSION $VERSION/g" < library/CMakeLists.txt > tmp
mv tmp library/CMakeLists.txt
if [ "X" != "X$SO_CRYPTO" ];
then
[ $VERBOSE ] && echo "Bumping SOVERSION for libmbedcrypto in library/CMakeLists.txt"
sed -e "/mbedcrypto/ s/ SOVERSION [0-9]\{1,\}/ SOVERSION $SO_CRYPTO/g" < library/CMakeLists.txt > tmp
mv tmp library/CMakeLists.txt
[ $VERBOSE ] && echo "Bumping SOVERSION for libmbedcrypto in library/Makefile"
sed -e "s/SOEXT_CRYPTO?=so.[0-9]\{1,\}/SOEXT_CRYPTO?=so.$SO_CRYPTO/g" < library/Makefile > tmp
mv tmp library/Makefile
fi
if [ "X" != "X$SO_X509" ];
then
[ $VERBOSE ] && echo "Bumping SOVERSION for libmbedx509 in library/CMakeLists.txt"
sed -e "/mbedx509/ s/ SOVERSION [0-9]\{1,\}/ SOVERSION $SO_X509/g" < library/CMakeLists.txt > tmp
mv tmp library/CMakeLists.txt
[ $VERBOSE ] && echo "Bumping SOVERSION for libmbedx509 in library/Makefile"
sed -e "s/SOEXT_X509?=so.[0-9]\{1,\}/SOEXT_X509?=so.$SO_X509/g" < library/Makefile > tmp
mv tmp library/Makefile
fi
if [ "X" != "X$SO_TLS" ];
then
[ $VERBOSE ] && echo "Bumping SOVERSION for libmbedtls in library/CMakeLists.txt"
sed -e "/mbedtls/ s/ SOVERSION [0-9]\{1,\}/ SOVERSION $SO_TLS/g" < library/CMakeLists.txt > tmp
mv tmp library/CMakeLists.txt
[ $VERBOSE ] && echo "Bumping SOVERSION for libmbedtls in library/Makefile"
sed -e "s/SOEXT_TLS?=so.[0-9]\{1,\}/SOEXT_TLS?=so.$SO_TLS/g" < library/Makefile > tmp
mv tmp library/Makefile
fi
[ $VERBOSE ] && echo "Bumping VERSION in include/mbedtls/build_info.h"
read MAJOR MINOR PATCH <<<$(IFS="."; echo $VERSION)
VERSION_NR="$( printf "0x%02X%02X%02X00" $MAJOR $MINOR $PATCH )"
cat include/mbedtls/build_info.h | \
sed -e "s/\(# *define *[A-Z]*_VERSION\)_MAJOR .\{1,\}/\1_MAJOR $MAJOR/" | \
sed -e "s/\(# *define *[A-Z]*_VERSION\)_MINOR .\{1,\}/\1_MINOR $MINOR/" | \
sed -e "s/\(# *define *[A-Z]*_VERSION\)_PATCH .\{1,\}/\1_PATCH $PATCH/" | \
sed -e "s/\(# *define *[A-Z]*_VERSION\)_NUMBER .\{1,\}/\1_NUMBER $VERSION_NR/" | \
sed -e "s/\(# *define *[A-Z]*_VERSION\)_STRING .\{1,\}/\1_STRING \"$VERSION\"/" | \
sed -e "s/\(# *define *[A-Z]*_VERSION\)_STRING_FULL .\{1,\}/\1_STRING_FULL \"Mbed TLS $VERSION\"/" \
> tmp
mv tmp include/mbedtls/build_info.h
[ $VERBOSE ] && echo "Bumping version in tests/suites/test_suite_version.data"
sed -e "s/version:\".\{1,\}/version:\"$VERSION\"/g" < tests/suites/test_suite_version.data > tmp
mv tmp tests/suites/test_suite_version.data
[ $VERBOSE ] && echo "Bumping PROJECT_NAME in doxygen/mbedtls.doxyfile and doxygen/input/doc_mainpage.h"
for i in doxygen/mbedtls.doxyfile doxygen/input/doc_mainpage.h;
do
sed -e "s/\\([Mm]bed TLS v\\)[0-9][0-9.]*/\\1$VERSION/g" < $i > tmp
mv tmp $i
done
[ $VERBOSE ] && echo "Re-generating library/error.c"
scripts/generate_errors.pl
[ $VERBOSE ] && echo "Re-generating programs/test/query_config.c"
scripts/generate_query_config.pl
[ $VERBOSE ] && echo "Re-generating library/version_features.c"
scripts/generate_features.pl
[ $VERBOSE ] && echo "Re-generating visualc files"
scripts/generate_visualc_files.pl
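The VERSION_NR computation above packs major/minor/patch into one byte each via `printf "0x%02X%02X%02X00"`; the same encoding can be checked with a small sketch (version 3.5.2 is an example input, not tied to any actual release):

```python
# Reproduce the shell printf encoding of the version number.
version = "3.5.2"  # hypothetical example version
major, minor, patch = (int(part) for part in version.split("."))
version_nr = "0x%02X%02X%02X00" % (major, minor, patch)
assert version_nr == "0x03050200"
```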


@ -0,0 +1,24 @@
# Python package requirements for Mbed TLS testing.
-r driver.requirements.txt
# Use a known version of Pylint, because new versions tend to add warnings
# that could start rejecting our code.
# 2.4.4 is the version in Ubuntu 20.04. It supports Python >=3.5.
pylint == 2.4.4
# Use the earliest version of mypy that works with our code base.
# See https://github.com/Mbed-TLS/mbedtls/pull/3953 .
mypy >= 0.780
# At the time of writing, only needed for tests/scripts/audit-validity-dates.py.
# It needs >=35.0.0 for correct operation, and that requires Python >=3.6,
# but our CI has Python 3.5. So let pip install the newest version that's
# compatible with the running Python: this way we get something good enough
# for mypy and pylint under Python 3.5, and we also get something good enough
# to run audit-validity-dates.py on Python >=3.6.
cryptography # >= 35.0.0
# For building `tests/data_files/server9-bad-saltlen.crt` and check python
# files.
asn1crypto

952
externals/mbedtls/scripts/code_size_compare.py vendored Executable file

@ -0,0 +1,952 @@
#!/usr/bin/env python3
"""
This script is for comparing the size of the library files from two
different Git revisions within an Mbed TLS repository.
The results of the comparison are formatted as CSV and stored at a
configurable location.
Note: must be run from Mbed TLS root.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import argparse
import logging
import os
import re
import shutil
import subprocess
import sys
import typing
from enum import Enum
from mbedtls_dev import build_tree
from mbedtls_dev import logging_util
from mbedtls_dev import typing_util
class SupportedArch(Enum):
"""Supported architecture for code size measurement."""
AARCH64 = 'aarch64'
AARCH32 = 'aarch32'
ARMV8_M = 'armv8-m'
X86_64 = 'x86_64'
X86 = 'x86'
class SupportedConfig(Enum):
"""Supported configuration for code size measurement."""
DEFAULT = 'default'
TFM_MEDIUM = 'tfm-medium'
# Static library
MBEDTLS_STATIC_LIB = {
'CRYPTO': 'library/libmbedcrypto.a',
'X509': 'library/libmbedx509.a',
'TLS': 'library/libmbedtls.a',
}
class CodeSizeDistinctInfo: # pylint: disable=too-few-public-methods
"""Data structure to store possibly distinct information for code size
comparison."""
def __init__( #pylint: disable=too-many-arguments
self,
version: str,
git_rev: str,
arch: str,
config: str,
compiler: str,
opt_level: str,
) -> None:
"""
:param version: which version to compare with for code size.
:param git_rev: Git revision to calculate code size.
:param arch: architecture to measure code size on.
:param config: configuration type to calculate code size.
(See SupportedConfig)
:param compiler: compiler used to build library/*.o.
:param opt_level: options that control optimization. (E.g. -Os)
"""
self.version = version
self.git_rev = git_rev
self.arch = arch
self.config = config
self.compiler = compiler
self.opt_level = opt_level
# Note: Variables below are not initialized by class instantiation.
self.pre_make_cmd = [] #type: typing.List[str]
self.make_cmd = ''
def get_info_indication(self):
"""Return a unique string to indicate Code Size Distinct Information."""
return '{git_rev}-{arch}-{config}-{compiler}'.format(**self.__dict__)
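`str.format(**self.__dict__)` above pulls the named fields straight from the instance attributes; the same pattern with a plain dict (values are invented):

```python
# Plain-dict stand-in for the instance attributes used above.
info = {'git_rev': 'abc1234', 'arch': 'x86_64',
        'config': 'default', 'compiler': 'cc'}
indication = '{git_rev}-{arch}-{config}-{compiler}'.format(**info)
assert indication == 'abc1234-x86_64-default-cc'
```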
class CodeSizeCommonInfo: # pylint: disable=too-few-public-methods
"""Data structure to store common information for code size comparison."""
def __init__(
self,
host_arch: str,
measure_cmd: str,
) -> None:
"""
:param host_arch: host architecture.
:param measure_cmd: command to measure code size for library/*.o.
"""
self.host_arch = host_arch
self.measure_cmd = measure_cmd
def get_info_indication(self):
"""Return a unique string to indicate Code Size Common Information."""
return '{measure_tool}'\
.format(measure_tool=self.measure_cmd.strip().split(' ')[0])
class CodeSizeResultInfo: # pylint: disable=too-few-public-methods
"""Data structure to store result options for code size comparison."""
def __init__( #pylint: disable=too-many-arguments
self,
record_dir: str,
comp_dir: str,
with_markdown=False,
stdout=False,
show_all=False,
) -> None:
"""
:param record_dir: directory to store code size record.
:param comp_dir: directory to store results of code size comparison.
:param with_markdown: write comparison result into a markdown table.
(Default: False)
:param stdout: direct comparison result into sys.stdout.
(Default: False)
:param show_all: show all objects in comparison result. (Default: False)
"""
self.record_dir = record_dir
self.comp_dir = comp_dir
self.with_markdown = with_markdown
self.stdout = stdout
self.show_all = show_all
DETECT_ARCH_CMD = "cc -dM -E - < /dev/null"
def detect_arch() -> str:
"""Auto-detect host architecture."""
cc_output = subprocess.check_output(DETECT_ARCH_CMD, shell=True).decode()
if '__aarch64__' in cc_output:
return SupportedArch.AARCH64.value
if '__arm__' in cc_output:
return SupportedArch.AARCH32.value
if '__x86_64__' in cc_output:
return SupportedArch.X86_64.value
if '__i386__' in cc_output:
return SupportedArch.X86.value
else:
print("Unknown host architecture, cannot auto-detect arch.")
sys.exit(1)
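detect_arch keys off the compiler's predefined macros; the selection logic can be exercised offline with canned preprocessor output (abridged, illustrative):

```python
# Canned `cc -dM -E -` output for an x86_64 host (abridged, illustrative).
cc_output = "#define __x86_64__ 1\n#define __SIZEOF_LONG__ 8\n"

def pick_arch(output):
    """Map predefined compiler macros to an architecture name."""
    for macro, arch in [('__aarch64__', 'aarch64'), ('__arm__', 'aarch32'),
                        ('__x86_64__', 'x86_64'), ('__i386__', 'x86')]:
        if macro in output:
            return arch
    return None

assert pick_arch(cc_output) == 'x86_64'
```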
TFM_MEDIUM_CONFIG_H = 'configs/ext/tfm_mbedcrypto_config_profile_medium.h'
TFM_MEDIUM_CRYPTO_CONFIG_H = 'configs/ext/crypto_config_profile_medium.h'
CONFIG_H = 'include/mbedtls/mbedtls_config.h'
CRYPTO_CONFIG_H = 'include/psa/crypto_config.h'
BACKUP_SUFFIX = '.code_size.bak'
class CodeSizeBuildInfo: # pylint: disable=too-few-public-methods
"""Gather information used to measure code size.
It collects information about architecture and configuration in order to
infer the build command for code size measurement.
"""
SupportedArchConfig = [
'-a ' + SupportedArch.AARCH64.value + ' -c ' + SupportedConfig.DEFAULT.value,
'-a ' + SupportedArch.AARCH32.value + ' -c ' + SupportedConfig.DEFAULT.value,
'-a ' + SupportedArch.X86_64.value + ' -c ' + SupportedConfig.DEFAULT.value,
'-a ' + SupportedArch.X86.value + ' -c ' + SupportedConfig.DEFAULT.value,
'-a ' + SupportedArch.ARMV8_M.value + ' -c ' + SupportedConfig.TFM_MEDIUM.value,
]
def __init__(
self,
size_dist_info: CodeSizeDistinctInfo,
host_arch: str,
logger: logging.Logger,
) -> None:
"""
:param size_dist_info:
CodeSizeDistinctInfo containing info for code size measurement.
- size_dist_info.arch: architecture to measure code size on.
- size_dist_info.config: configuration type to measure
code size with.
- size_dist_info.compiler: compiler used to build library/*.o.
- size_dist_info.opt_level: Options that control optimization.
(E.g. -Os)
:param host_arch: host architecture.
:param logger: logging module
"""
self.arch = size_dist_info.arch
self.config = size_dist_info.config
self.compiler = size_dist_info.compiler
self.opt_level = size_dist_info.opt_level
self.make_cmd = ['make', '-j', 'lib']
self.host_arch = host_arch
self.logger = logger
def check_correctness(self) -> bool:
"""Check whether we are using proper / supported combination
of information to build library/*.o."""
# default config
if self.config == SupportedConfig.DEFAULT.value and \
self.arch == self.host_arch:
return True
# TF-M
elif self.arch == SupportedArch.ARMV8_M.value and \
self.config == SupportedConfig.TFM_MEDIUM.value:
return True
return False
def infer_pre_make_command(self) -> typing.List[str]:
"""Infer command to set up proper configuration before running make."""
pre_make_cmd = [] #type: typing.List[str]
if self.config == SupportedConfig.TFM_MEDIUM.value:
pre_make_cmd.append('cp {src} {dest}'
.format(src=TFM_MEDIUM_CONFIG_H, dest=CONFIG_H))
pre_make_cmd.append('cp {src} {dest}'
.format(src=TFM_MEDIUM_CRYPTO_CONFIG_H,
dest=CRYPTO_CONFIG_H))
return pre_make_cmd
def infer_make_cflags(self) -> str:
"""Infer CFLAGS by instance attributes in CodeSizeDistinctInfo."""
cflags = [] #type: typing.List[str]
# set optimization level
cflags.append(self.opt_level)
# set compiler by config
if self.config == SupportedConfig.TFM_MEDIUM.value:
self.compiler = 'armclang'
cflags.append('-mcpu=cortex-m33')
# set target
if self.compiler == 'armclang':
cflags.append('--target=arm-arm-none-eabi')
return ' '.join(cflags)
def infer_make_command(self) -> str:
"""Infer make command by CFLAGS and CC."""
if self.check_correctness():
# set CFLAGS=
self.make_cmd.append('CFLAGS=\'{}\''.format(self.infer_make_cflags()))
# set CC=
self.make_cmd.append('CC={}'.format(self.compiler))
return ' '.join(self.make_cmd)
else:
self.logger.error("Unsupported combination of architecture: {} " \
"and configuration: {}.\n"
.format(self.arch,
self.config))
self.logger.error("Please use supported combination of " \
"architecture and configuration:")
for comb in CodeSizeBuildInfo.SupportedArchConfig:
self.logger.error(comb)
self.logger.error("")
self.logger.error("For your system, please use:")
for comb in CodeSizeBuildInfo.SupportedArchConfig:
if "default" in comb and self.host_arch not in comb:
continue
self.logger.error(comb)
sys.exit(1)
class CodeSizeCalculator:
""" A calculator to calculate code size of library/*.o based on
Git revision and code size measurement tool.
"""
def __init__( #pylint: disable=too-many-arguments
self,
git_rev: str,
pre_make_cmd: typing.List[str],
make_cmd: str,
measure_cmd: str,
logger: logging.Logger,
) -> None:
"""
:param git_rev: Git revision. (E.g: commit)
:param pre_make_cmd: command to set up proper config before running make.
:param make_cmd: command to build library/*.o.
:param measure_cmd: command to measure code size for library/*.o.
:param logger: logging module
"""
self.repo_path = "."
self.git_command = "git"
self.make_clean = 'make clean'
self.git_rev = git_rev
self.pre_make_cmd = pre_make_cmd
self.make_cmd = make_cmd
self.measure_cmd = measure_cmd
self.logger = logger
@staticmethod
def validate_git_revision(git_rev: str) -> str:
result = subprocess.check_output(["git", "rev-parse", "--verify",
git_rev + "^{commit}"],
shell=False, universal_newlines=True)
return result[:7]
def _create_git_worktree(self) -> str:
"""Create a separate worktree for Git revision.
If the Git revision is 'current', use the current worktree instead."""
if self.git_rev == 'current':
self.logger.debug("Using current work directory.")
git_worktree_path = self.repo_path
else:
self.logger.debug("Creating git worktree for {}."
.format(self.git_rev))
git_worktree_path = os.path.join(self.repo_path,
"temp-" + self.git_rev)
subprocess.check_output(
[self.git_command, "worktree", "add", "--detach",
git_worktree_path, self.git_rev], cwd=self.repo_path,
stderr=subprocess.STDOUT
)
return git_worktree_path
@staticmethod
def backup_config_files(restore: bool) -> None:
"""Backup / Restore config files."""
if restore:
shutil.move(CONFIG_H + BACKUP_SUFFIX, CONFIG_H)
shutil.move(CRYPTO_CONFIG_H + BACKUP_SUFFIX, CRYPTO_CONFIG_H)
else:
shutil.copy(CONFIG_H, CONFIG_H + BACKUP_SUFFIX)
shutil.copy(CRYPTO_CONFIG_H, CRYPTO_CONFIG_H + BACKUP_SUFFIX)
def _build_libraries(self, git_worktree_path: str) -> None:
"""Build library/*.o in the specified worktree."""
self.logger.debug("Building library/*.o for {}."
.format(self.git_rev))
my_environment = os.environ.copy()
try:
if self.git_rev == 'current':
self.backup_config_files(restore=False)
for pre_cmd in self.pre_make_cmd:
subprocess.check_output(
pre_cmd, env=my_environment, shell=True,
cwd=git_worktree_path, stderr=subprocess.STDOUT,
universal_newlines=True
)
subprocess.check_output(
self.make_clean, env=my_environment, shell=True,
cwd=git_worktree_path, stderr=subprocess.STDOUT,
universal_newlines=True
)
subprocess.check_output(
self.make_cmd, env=my_environment, shell=True,
cwd=git_worktree_path, stderr=subprocess.STDOUT,
universal_newlines=True
)
if self.git_rev == 'current':
self.backup_config_files(restore=True)
except subprocess.CalledProcessError as e:
self._handle_called_process_error(e, git_worktree_path)
def _gen_raw_code_size(self, git_worktree_path: str) -> typing.Dict[str, str]:
"""Measure code size by a tool and return in UTF-8 encoding."""
self.logger.debug("Measuring code size for {} by `{}`."
.format(self.git_rev,
self.measure_cmd.strip().split(' ')[0]))
res = {}
for mod, st_lib in MBEDTLS_STATIC_LIB.items():
try:
result = subprocess.check_output(
[self.measure_cmd + ' ' + st_lib], cwd=git_worktree_path,
shell=True, universal_newlines=True
)
res[mod] = result
except subprocess.CalledProcessError as e:
self._handle_called_process_error(e, git_worktree_path)
return res
def _remove_worktree(self, git_worktree_path: str) -> None:
"""Remove temporary worktree."""
if git_worktree_path != self.repo_path:
self.logger.debug("Removing temporary worktree {}."
.format(git_worktree_path))
subprocess.check_output(
[self.git_command, "worktree", "remove", "--force",
git_worktree_path], cwd=self.repo_path,
stderr=subprocess.STDOUT
)
def _handle_called_process_error(self, e: subprocess.CalledProcessError,
git_worktree_path: str) -> None:
"""Handle a CalledProcessError and quit the program gracefully.
Remove any extra worktrees so that the script may be called again."""
# Tell the user what went wrong
self.logger.error(e, exc_info=True)
self.logger.error("Process output:\n {}".format(e.output))
# Quit gracefully by removing the existing worktree
self._remove_worktree(git_worktree_path)
sys.exit(-1)
def cal_libraries_code_size(self) -> typing.Dict[str, str]:
"""Do a complete round to calculate code size of library/*.o
by measurement tool.
:return: A dictionary of measured code size
- typing.Dict[mod: str]
"""
git_worktree_path = self._create_git_worktree()
try:
self._build_libraries(git_worktree_path)
res = self._gen_raw_code_size(git_worktree_path)
finally:
self._remove_worktree(git_worktree_path)
return res
class CodeSizeGenerator:
""" A generator based on size measurement tool for library/*.o.
This is an abstract class. To use it, derive a class that implements
write_record and write_comparison methods, then call both of them with
proper arguments.
"""
def __init__(self, logger: logging.Logger) -> None:
"""
:param logger: logging module
"""
self.logger = logger
def write_record(
self,
git_rev: str,
code_size_text: typing.Dict[str, str],
output: typing_util.Writable
) -> None:
"""Write size record into a file.
:param git_rev: Git revision. (E.g: commit)
:param code_size_text:
string output (utf-8) from measurement tool of code size.
- typing.Dict[mod: str]
:param output: output stream which the code size record is written to.
(Note: Normally write code size record into File)
"""
raise NotImplementedError
def write_comparison( #pylint: disable=too-many-arguments
self,
old_rev: str,
new_rev: str,
output: typing_util.Writable,
with_markdown=False,
show_all=False
) -> None:
"""Write a comparision result into a stream between two Git revisions.
:param old_rev: old Git revision to compared with.
:param new_rev: new Git revision to compared with.
:param output: output stream which the code size record is written to.
(File / sys.stdout)
:param with_markdown: write comparison result in a markdown table.
(Default: False)
:param show_all: show all objects in comparison result. (Default False)
"""
raise NotImplementedError
class CodeSizeGeneratorWithSize(CodeSizeGenerator):
"""Code Size Base Class for size record saving and writing."""
class SizeEntry: # pylint: disable=too-few-public-methods
"""Data Structure to only store information of code size."""
def __init__(self, text: int, data: int, bss: int, dec: int):
self.text = text
self.data = data
self.bss = bss
self.total = dec # total <=> dec
def __init__(self, logger: logging.Logger) -> None:
""" Variable code_size is used to store size info for any Git revisions.
:param code_size:
Data format is as follows:
code_size = {
git_rev: {
module: {
file_name: SizeEntry,
...
},
...
},
...
}
"""
super().__init__(logger)
self.code_size = {} #type: typing.Dict[str, typing.Dict]
self.mod_total_suffix = '-' + 'TOTALS'
def _set_size_record(self, git_rev: str, mod: str, size_text: str) -> None:
"""Store size information for target Git revision and high-level module.
size_text Format: text data bss dec hex filename
"""
size_record = {}
for line in size_text.splitlines()[1:]:
data = line.split()
if re.match(r'\s*\(TOTALS\)', data[5]):
data[5] = mod + self.mod_total_suffix
# file_name: SizeEntry(text, data, bss, dec)
size_record[data[5]] = CodeSizeGeneratorWithSize.SizeEntry(
int(data[0]), int(data[1]), int(data[2]), int(data[3]))
self.code_size.setdefault(git_rev, {}).update({mod: size_record})
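The parser above consumes Berkeley-style `size` output (columns: text data bss dec hex filename); a self-contained sketch of the column split, using an invented object file:

```python
# Illustrative Berkeley `size` output; aes.o and its numbers are made up.
size_text = ("   text\t   data\t    bss\t    dec\t    hex\tfilename\n"
             "   1200\t     16\t      4\t   1220\t    4c4\taes.o\n")
line = size_text.splitlines()[1]   # skip the header line, as the parser does
cols = line.split()
text, data, bss, dec = (int(x) for x in cols[:4])
assert (text, data, bss, dec) == (1200, 16, 4, 1220)
assert cols[5] == 'aes.o'
```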
def read_size_record(self, git_rev: str, fname: str) -> None:
"""Read size information from csv file and write it into code_size.
fname Format: filename text data bss dec
"""
mod = ""
size_record = {}
with open(fname, 'r') as csv_file:
for line in csv_file:
data = line.strip().split()
# check if we find the beginning of a module
if data and data[0] in MBEDTLS_STATIC_LIB:
mod = data[0]
continue
if mod:
# file_name: SizeEntry(text, data, bss, dec)
size_record[data[0]] = CodeSizeGeneratorWithSize.SizeEntry(
int(data[1]), int(data[2]), int(data[3]), int(data[4]))
# check if we hit record for the end of a module
m = re.match(r'\w+' + self.mod_total_suffix, line)
if m:
if git_rev in self.code_size:
self.code_size[git_rev].update({mod: size_record})
else:
self.code_size[git_rev] = {mod: size_record}
mod = ""
size_record = {}
def write_record(
self,
git_rev: str,
code_size_text: typing.Dict[str, str],
output: typing_util.Writable
) -> None:
"""Write size information to a file.
Writing Format: filename text data bss total(dec)
"""
for mod, size_text in code_size_text.items():
self._set_size_record(git_rev, mod, size_text)
format_string = "{:<30} {:>7} {:>7} {:>7} {:>7}\n"
output.write(format_string.format("filename",
"text", "data", "bss", "total"))
for mod, f_size in self.code_size[git_rev].items():
output.write("\n" + mod + "\n")
for fname, size_entry in f_size.items():
output.write(format_string
.format(fname,
size_entry.text, size_entry.data,
size_entry.bss, size_entry.total))
def write_comparison( #pylint: disable=too-many-arguments
self,
old_rev: str,
new_rev: str,
output: typing_util.Writable,
with_markdown=False,
show_all=False
) -> None:
# pylint: disable=too-many-locals
"""Write comparison result into a file.
Writing Format:
Markdown Output:
filename new(text) new(data) change(text) change(data)
CSV Output:
filename new(text) new(data) old(text) old(data) change(text) change(data)
"""
header_line = ["filename", "new(text)", "old(text)", "change(text)",
"new(data)", "old(data)", "change(data)"]
if with_markdown:
dash_line = [":----", "----:", "----:", "----:",
"----:", "----:", "----:"]
# | filename | new(text) | new(data) | change(text) | change(data) |
line_format = "| {0:<30} | {1:>9} | {4:>9} | {3:>12} | {6:>12} |\n"
bold_text = lambda x: '**' + str(x) + '**'
else:
# filename new(text) new(data) old(text) old(data) change(text) change(data)
line_format = "{0:<30} {1:>9} {4:>9} {2:>10} {5:>10} {3:>12} {6:>12}\n"
def cal_sect_change(
old_size: typing.Optional[CodeSizeGeneratorWithSize.SizeEntry],
new_size: typing.Optional[CodeSizeGeneratorWithSize.SizeEntry],
sect: str
) -> typing.List:
"""Inner helper function to calculate size change for a section.
Convention for special cases:
- If the object has been removed in the new Git revision,
the size change is minus its code size in the old Git revision
and the new size is marked as `Removed`.
- If the object only exists in the new Git revision,
the size change is its code size in the new Git revision
and the old size is marked as `NotCreated`.
:param: old_size: code size for objects in old Git revision.
:param: new_size: code size for objects in new Git revision.
:param: sect: section to calculate from `size` tool. This could be
any instance variable in SizeEntry.
:return: List of [section size of objects for new Git revision,
section size of objects for old Git revision,
section size change of objects between two Git revisions]
"""
if old_size and new_size:
new_attr = new_size.__dict__[sect]
old_attr = old_size.__dict__[sect]
delta = new_attr - old_attr
change_attr = '{0:{1}}'.format(delta, '+' if delta else '')
elif old_size:
new_attr = 'Removed'
old_attr = old_size.__dict__[sect]
delta = - old_attr
change_attr = '{0:{1}}'.format(delta, '+' if delta else '')
elif new_size:
new_attr = new_size.__dict__[sect]
old_attr = 'NotCreated'
delta = new_attr
change_attr = '{0:{1}}'.format(delta, '+' if delta else '')
else:
# Should never happen
new_attr = 'Error'
old_attr = 'Error'
change_attr = 'Error'
return [new_attr, old_attr, change_attr]
# sort dictionary by key
sort_by_k = lambda item: item[0].lower()
def get_results(
f_rev_size:
typing.Dict[str,
typing.Dict[str,
CodeSizeGeneratorWithSize.SizeEntry]]
) -> typing.List:
"""Return List of results in the format of:
[filename, new(text), old(text), change(text),
new(data), old(data), change(data)]
"""
res = []
for fname, revs_size in sorted(f_rev_size.items(), key=sort_by_k):
old_size = revs_size.get(old_rev)
new_size = revs_size.get(new_rev)
text_sect = cal_sect_change(old_size, new_size, 'text')
data_sect = cal_sect_change(old_size, new_size, 'data')
# skip the files that haven't changed in code size
if not show_all and text_sect[-1] == '0' and data_sect[-1] == '0':
continue
res.append([fname, *text_sect, *data_sect])
return res
# write header
output.write(line_format.format(*header_line))
if with_markdown:
output.write(line_format.format(*dash_line))
for mod in MBEDTLS_STATIC_LIB:
# convert self.code_size to:
# {
# file_name: {
# old_rev: SizeEntry,
# new_rev: SizeEntry
# },
# ...
# }
f_rev_size = {} #type: typing.Dict[str, typing.Dict]
for fname, size_entry in self.code_size[old_rev][mod].items():
f_rev_size.setdefault(fname, {}).update({old_rev: size_entry})
for fname, size_entry in self.code_size[new_rev][mod].items():
f_rev_size.setdefault(fname, {}).update({new_rev: size_entry})
mod_total_sz = f_rev_size.pop(mod + self.mod_total_suffix)
res = get_results(f_rev_size)
total_clm = get_results({mod + self.mod_total_suffix: mod_total_sz})
if with_markdown:
# bold row of mod-TOTALS in markdown table
total_clm = [[bold_text(j) for j in i] for i in total_clm]
res += total_clm
# write comparison result
for line in res:
output.write(line_format.format(*line))
class CodeSizeComparison:
"""Compare code size between two Git revisions."""
def __init__( #pylint: disable=too-many-arguments
self,
old_size_dist_info: CodeSizeDistinctInfo,
new_size_dist_info: CodeSizeDistinctInfo,
size_common_info: CodeSizeCommonInfo,
result_options: CodeSizeResultInfo,
logger: logging.Logger,
) -> None:
"""
:param old_size_dist_info: CodeSizeDistinctInfo containing old distinct
info to compare code size with.
:param new_size_dist_info: CodeSizeDistinctInfo containing new distinct
info to take as the comparison base.
:param size_common_info: CodeSizeCommonInfo containing common info for
both old and new size distinct info and
measurement tool.
:param result_options: CodeSizeResultInfo containing result options for
code size recording and comparison.
:param logger: logging module
"""
self.logger = logger
self.old_size_dist_info = old_size_dist_info
self.new_size_dist_info = new_size_dist_info
self.size_common_info = size_common_info
# infer pre make command
self.old_size_dist_info.pre_make_cmd = CodeSizeBuildInfo(
self.old_size_dist_info, self.size_common_info.host_arch,
self.logger).infer_pre_make_command()
self.new_size_dist_info.pre_make_cmd = CodeSizeBuildInfo(
self.new_size_dist_info, self.size_common_info.host_arch,
self.logger).infer_pre_make_command()
# infer make command
self.old_size_dist_info.make_cmd = CodeSizeBuildInfo(
self.old_size_dist_info, self.size_common_info.host_arch,
self.logger).infer_make_command()
self.new_size_dist_info.make_cmd = CodeSizeBuildInfo(
self.new_size_dist_info, self.size_common_info.host_arch,
self.logger).infer_make_command()
# initialize size parser with corresponding measurement tool
self.code_size_generator = self.__generate_size_parser()
self.result_options = result_options
self.csv_dir = os.path.abspath(self.result_options.record_dir)
os.makedirs(self.csv_dir, exist_ok=True)
self.comp_dir = os.path.abspath(self.result_options.comp_dir)
os.makedirs(self.comp_dir, exist_ok=True)
def __generate_size_parser(self):
"""Generate a parser for the corresponding measurement tool."""
if re.match(r'size', self.size_common_info.measure_cmd.strip()):
return CodeSizeGeneratorWithSize(self.logger)
else:
self.logger.error("Unsupported measurement tool: `{}`."
.format(self.size_common_info.measure_cmd
.strip().split(' ')[0]))
sys.exit(1)
def cal_code_size(
self,
size_dist_info: CodeSizeDistinctInfo
) -> typing.Dict[str, str]:
"""Calculate code size of library/*.o in a UTF-8 encoding"""
return CodeSizeCalculator(size_dist_info.git_rev,
size_dist_info.pre_make_cmd,
size_dist_info.make_cmd,
self.size_common_info.measure_cmd,
self.logger).cal_libraries_code_size()
def gen_code_size_report(self, size_dist_info: CodeSizeDistinctInfo) -> None:
"""Generate code size record and write it into a file."""
self.logger.info("Start to generate code size record for {}."
.format(size_dist_info.git_rev))
output_file = os.path.join(
self.csv_dir,
'{}-{}.csv'
.format(size_dist_info.get_info_indication(),
self.size_common_info.get_info_indication()))
# Check if the corresponding record exists
if size_dist_info.git_rev != "current" and \
os.path.exists(output_file):
self.logger.debug("Code size csv file for {} already exists."
.format(size_dist_info.git_rev))
self.code_size_generator.read_size_record(
size_dist_info.git_rev, output_file)
else:
# measure code size
code_size_text = self.cal_code_size(size_dist_info)
self.logger.debug("Generating code size csv for {}."
.format(size_dist_info.git_rev))
output = open(output_file, "w")
self.code_size_generator.write_record(
size_dist_info.git_rev, code_size_text, output)
def gen_code_size_comparison(self) -> None:
"""Generate results of code size changes between two Git revisions,
old and new.
- Measured code size result of these two Git revisions must be available.
- The result is directed into either file / stdout depending on
the option, size_common_info.result_options.stdout. (Default: file)
"""
self.logger.info("Start to generate comparison result between "\
"{} and {}."
.format(self.old_size_dist_info.git_rev,
self.new_size_dist_info.git_rev))
if self.result_options.stdout:
output = sys.stdout
else:
output_file = os.path.join(
self.comp_dir,
'{}-{}-{}.{}'
.format(self.old_size_dist_info.get_info_indication(),
self.new_size_dist_info.get_info_indication(),
self.size_common_info.get_info_indication(),
'md' if self.result_options.with_markdown else 'csv'))
output = open(output_file, "w")
self.logger.debug("Generating comparison results between {} and {}."
.format(self.old_size_dist_info.git_rev,
self.new_size_dist_info.git_rev))
if self.result_options.with_markdown or self.result_options.stdout:
print("Measure code size between {} and {} by `{}`."
.format(self.old_size_dist_info.get_info_indication(),
self.new_size_dist_info.get_info_indication(),
self.size_common_info.get_info_indication()),
file=output)
self.code_size_generator.write_comparison(
self.old_size_dist_info.git_rev,
self.new_size_dist_info.git_rev,
output, self.result_options.with_markdown,
self.result_options.show_all)
def get_comparision_results(self) -> None:
"""Compare the size of library/*.o between self.old_size_dist_info
and self.new_size_dist_info and generate the result file."""
build_tree.check_repo_path()
self.gen_code_size_report(self.old_size_dist_info)
self.gen_code_size_report(self.new_size_dist_info)
self.gen_code_size_comparison()
def main():
parser = argparse.ArgumentParser(description=(__doc__))
group_required = parser.add_argument_group(
'required arguments',
'required arguments to parse for running ' + os.path.basename(__file__))
group_required.add_argument(
'-o', '--old-rev', type=str, required=True,
help='old Git revision for comparison.')
group_optional = parser.add_argument_group(
'optional arguments',
'optional arguments to parse for running ' + os.path.basename(__file__))
group_optional.add_argument(
'--record-dir', type=str, default='code_size_records',
help='directory where code size record is stored. '
'(Default: code_size_records)')
group_optional.add_argument(
'--comp-dir', type=str, default='comparison',
help='directory where comparison result is stored. '
'(Default: comparison)')
group_optional.add_argument(
'-n', '--new-rev', type=str, default='current',
help='new Git revision as comparison base. '
'(Default is the current working directory, including uncommitted '
'changes.)')
group_optional.add_argument(
'-a', '--arch', type=str, default=detect_arch(),
choices=list(map(lambda s: s.value, SupportedArch)),
help='Specify architecture for code size comparison. '
'(Default is the host architecture.)')
group_optional.add_argument(
'-c', '--config', type=str, default=SupportedConfig.DEFAULT.value,
choices=list(map(lambda s: s.value, SupportedConfig)),
help='Specify configuration type for code size comparison. '
'(Default is the current Mbed TLS configuration.)')
group_optional.add_argument(
'--markdown', action='store_true', dest='markdown',
help='Show comparison of code size in a markdown table. '
'(Only show the files that have changed).')
group_optional.add_argument(
'--stdout', action='store_true', dest='stdout',
help='Set this option to direct comparison result into sys.stdout. '
'(Default: file)')
group_optional.add_argument(
'--show-all', action='store_true', dest='show_all',
help='Show all the objects in comparison result, including the ones '
'that haven\'t changed in code size. (Default: False)')
group_optional.add_argument(
'--verbose', action='store_true', dest='verbose',
help='Show logs in detail for code size measurement. '
'(Default: False)')
comp_args = parser.parse_args()
logger = logging.getLogger()
logging_util.configure_logger(logger, split_level=logging.NOTSET)
logger.setLevel(logging.DEBUG if comp_args.verbose else logging.INFO)
if os.path.isfile(comp_args.record_dir):
logger.error("record directory: {} is not a directory"
.format(comp_args.record_dir))
sys.exit(1)
if os.path.isfile(comp_args.comp_dir):
logger.error("comparison directory: {} is not a directory"
.format(comp_args.comp_dir))
sys.exit(1)
comp_args.old_rev = CodeSizeCalculator.validate_git_revision(
comp_args.old_rev)
if comp_args.new_rev != 'current':
comp_args.new_rev = CodeSizeCalculator.validate_git_revision(
comp_args.new_rev)
# version, git_rev, arch, config, compiler, opt_level
old_size_dist_info = CodeSizeDistinctInfo(
'old', comp_args.old_rev, comp_args.arch, comp_args.config, 'cc', '-Os')
new_size_dist_info = CodeSizeDistinctInfo(
'new', comp_args.new_rev, comp_args.arch, comp_args.config, 'cc', '-Os')
# host_arch, measure_cmd
size_common_info = CodeSizeCommonInfo(
detect_arch(), 'size -t')
# record_dir, comp_dir, with_markdown, stdout, show_all
result_options = CodeSizeResultInfo(
comp_args.record_dir, comp_args.comp_dir,
comp_args.markdown, comp_args.stdout, comp_args.show_all)
logger.info("Measure code size between {} and {} by `{}`."
.format(old_size_dist_info.get_info_indication(),
new_size_dist_info.get_info_indication(),
size_common_info.get_info_indication()))
CodeSizeComparison(old_size_dist_info, new_size_dist_info,
size_common_info, result_options,
logger).get_comparision_results()
if __name__ == "__main__":
main()
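The sign-formatting convention documented in cal_sect_change() above can be seen in isolation below. This is a minimal sketch, not part of the script; sect_change is a hypothetical stand-alone helper mirroring the same special-case rules.

```python
def sect_change(old, new):
    """Return (new, old, change) display strings for one section,
    following the conventions of cal_sect_change():
    non-zero deltas get an explicit sign, zero is rendered bare."""
    if old is None and new is None:
        return ('Error', 'Error', 'Error')   # should never happen
    if old is None:                          # object only exists in new rev
        return (str(new), 'NotCreated',
                '{0:{1}}'.format(new, '+' if new else ''))
    if new is None:                          # object removed in new rev
        return ('Removed', str(old),
                '{0:{1}}'.format(-old, '+' if old else ''))
    delta = new - old
    return (str(new), str(old),
            '{0:{1}}'.format(delta, '+' if delta else ''))

print(sect_change(100, 120))   # ('120', '100', '+20')
print(sect_change(100, 80))    # ('80', '100', '-20')
print(sect_change(None, 50))   # ('50', 'NotCreated', '+50')
print(sect_change(70, None))   # ('Removed', '70', '-70')
```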

externals/mbedtls/scripts/code_style.py vendored Executable file
@@ -0,0 +1,222 @@
#!/usr/bin/env python3
"""Check or fix the code style by running Uncrustify.
This script must be run from the root of a Git work tree containing Mbed TLS.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import argparse
import os
import re
import subprocess
import sys
from typing import FrozenSet, List, Optional
UNCRUSTIFY_SUPPORTED_VERSION = "0.75.1"
CONFIG_FILE = ".uncrustify.cfg"
UNCRUSTIFY_EXE = "uncrustify"
UNCRUSTIFY_ARGS = ["-c", CONFIG_FILE]
CHECK_GENERATED_FILES = "tests/scripts/check-generated-files.sh"
def print_err(*args):
print("Error: ", *args, file=sys.stderr)
# Print the file names that will be skipped and the help message
def print_skip(files_to_skip):
print()
print(*files_to_skip, sep=", SKIP\n", end=", SKIP\n")
print("Warning: The listed files will be skipped because\n"
"they are not known to git.")
print()
# Match FILENAME(s) in "check SCRIPT (FILENAME...)"
CHECK_CALL_RE = re.compile(r"\n\s*check\s+[^\s#$&*?;|]+([^\n#$&*?;|]+)",
re.ASCII)
def list_generated_files() -> FrozenSet[str]:
"""Return the names of generated files.
We don't reformat generated files, since the result might be different
from the output of the generator. Ideally the result of the generator
would conform to the code style, but this would be difficult, especially
with respect to the placement of line breaks in long logical lines.
"""
# Parse check-generated-files.sh to get an up-to-date list of
# generated files. Read the file rather than calling it so that
# this script only depends on Git, Python and uncrustify, and not other
# tools such as sh or grep which might not be available on Windows.
# This introduces a limitation: check-generated-files.sh must have
# the expected format and must list the files explicitly, not through
# wildcards or command substitution.
content = open(CHECK_GENERATED_FILES, encoding="utf-8").read()
checks = re.findall(CHECK_CALL_RE, content)
return frozenset(word for s in checks for word in s.split())
def get_src_files(since: Optional[str]) -> List[str]:
"""
Use git to get a list of the source files.
The optional argument since is a commit, indicating to only list files
that have changed since that commit. Without this argument, list all
files known to git.
Only C files are included, and certain files (generated, or 3rdparty)
are excluded.
"""
file_patterns = ["*.[hc]",
"tests/suites/*.function",
"scripts/data_files/*.fmt"]
output = subprocess.check_output(["git", "ls-files"] + file_patterns,
universal_newlines=True)
src_files = output.split()
if since:
# get all files changed in commits since the starting point
cmd = ["git", "log", since + "..HEAD", "--name-only", "--pretty=", "--"] + src_files
output = subprocess.check_output(cmd, universal_newlines=True)
committed_changed_files = output.split()
# and also get all files with uncommitted changes
cmd = ["git", "diff", "--name-only", "--"] + src_files
output = subprocess.check_output(cmd, universal_newlines=True)
uncommitted_changed_files = output.split()
src_files = list(set(committed_changed_files + uncommitted_changed_files))
generated_files = list_generated_files()
# Don't correct style for third-party files (and, for simplicity,
# companion files in the same subtree), or for automatically
# generated files (we're correcting the templates instead).
src_files = [filename for filename in src_files
if not (filename.startswith("3rdparty/") or
filename in generated_files)]
return src_files
def get_uncrustify_version() -> str:
"""
Get the version string from Uncrustify
"""
result = subprocess.run([UNCRUSTIFY_EXE, "--version"],
stdout=subprocess.PIPE, stderr=subprocess.PIPE,
check=False)
if result.returncode != 0:
print_err("Could not get Uncrustify version:", str(result.stderr, "utf-8"))
return ""
else:
return str(result.stdout, "utf-8")
def check_style_is_correct(src_file_list: List[str]) -> bool:
"""
Check the code style and output a diff for each file whose style is
incorrect.
"""
style_correct = True
for src_file in src_file_list:
uncrustify_cmd = [UNCRUSTIFY_EXE] + UNCRUSTIFY_ARGS + [src_file]
result = subprocess.run(uncrustify_cmd, stdout=subprocess.PIPE,
stderr=subprocess.PIPE, check=False)
if result.returncode != 0:
print_err("Uncrustify returned " + str(result.returncode) +
" correcting file " + src_file)
return False
# Uncrustify makes changes to the code and places the result in a new
# file with the extension ".uncrustify". To get the changes (if any)
# simply diff the 2 files.
diff_cmd = ["diff", "-u", src_file, src_file + ".uncrustify"]
cp = subprocess.run(diff_cmd, check=False)
if cp.returncode == 1:
print(src_file + " changed - code style is incorrect.")
style_correct = False
elif cp.returncode != 0:
raise subprocess.CalledProcessError(cp.returncode, cp.args,
cp.stdout, cp.stderr)
# Tidy up artifact
os.remove(src_file + ".uncrustify")
return style_correct
def fix_style_single_pass(src_file_list: List[str]) -> bool:
"""
Run Uncrustify once over the source files.
"""
code_change_args = UNCRUSTIFY_ARGS + ["--no-backup"]
for src_file in src_file_list:
uncrustify_cmd = [UNCRUSTIFY_EXE] + code_change_args + [src_file]
result = subprocess.run(uncrustify_cmd, check=False)
if result.returncode != 0:
print_err("Uncrustify returned " +
str(result.returncode) + " correcting file " +
src_file)
return False
return True
def fix_style(src_file_list: List[str]) -> int:
"""
Fix the code style. This takes 2 passes of Uncrustify.
"""
if not fix_style_single_pass(src_file_list):
return 1
if not fix_style_single_pass(src_file_list):
return 1
# Guard against future changes that cause the codebase to require
# more passes.
if not check_style_is_correct(src_file_list):
print_err("Code style still incorrect after second run of Uncrustify.")
return 1
else:
return 0
def main() -> int:
"""
Main with command line arguments.
"""
uncrustify_version = get_uncrustify_version().strip()
if UNCRUSTIFY_SUPPORTED_VERSION not in uncrustify_version:
print("Warning: Using unsupported Uncrustify version '" +
uncrustify_version + "'")
print("Note: The only supported version is " +
UNCRUSTIFY_SUPPORTED_VERSION)
parser = argparse.ArgumentParser()
parser.add_argument('-f', '--fix', action='store_true',
help=('modify source files to fix the code style '
'(default: print diff, do not modify files)'))
parser.add_argument('-s', '--since', metavar='COMMIT', const='development', nargs='?',
help=('only check files modified since the specified commit'
' (e.g. --since=HEAD~3 or --since=development). If no'
' commit is specified, default to development.'))
# --subset is almost useless: it only matters if there are no files
# ('code_style.py' without arguments checks all files known to Git,
# 'code_style.py --subset' does nothing). In particular,
# 'code_style.py --fix --subset ...' is intended as a stable ("porcelain")
# way to restyle a possibly empty set of files.
parser.add_argument('--subset', action='store_true',
help='only check the specified files (default with non-option arguments)')
parser.add_argument('operands', nargs='*', metavar='FILE',
help='files to check (files MUST be known to git, if none: check all)')
args = parser.parse_args()
covered = frozenset(get_src_files(args.since))
# We only check files that are known to git
if args.subset or args.operands:
src_files = [f for f in args.operands if f in covered]
skip_src_files = [f for f in args.operands if f not in covered]
if skip_src_files:
print_skip(skip_src_files)
else:
src_files = list(covered)
if args.fix:
# Fix mode
return fix_style(src_files)
else:
# Check mode
if check_style_is_correct(src_files):
print("Checked {} files, style ok.".format(len(src_files)))
return 0
else:
return 1
if __name__ == '__main__':
sys.exit(main())
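The CHECK_CALL_RE parsing in list_generated_files() can be exercised on its own. The sketch below feeds the same regex a hypothetical excerpt in the `check SCRIPT FILE...` shape that check-generated-files.sh is expected to follow; the script and file names are illustrative only.

```python
import re

# Hypothetical excerpt in the format list_generated_files() expects.
SAMPLE = """
check scripts/generate_errors.pl library/error.c
check scripts/generate_features.pl library/version_features.c
"""

# Same pattern as in code_style.py: skip the SCRIPT word (which may not
# contain whitespace or shell metacharacters), capture the FILENAME tail.
CHECK_CALL_RE = re.compile(r"\n\s*check\s+[^\s#$&*?;|]+([^\n#$&*?;|]+)",
                           re.ASCII)

checks = re.findall(CHECK_CALL_RE, SAMPLE)
generated = frozenset(word for s in checks for word in s.split())
print(sorted(generated))   # ['library/error.c', 'library/version_features.c']
```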

externals/mbedtls/scripts/common.make vendored Normal file
@@ -0,0 +1,123 @@
# To compile on SunOS: add "-lsocket -lnsl" to LDFLAGS
ifndef MBEDTLS_PATH
MBEDTLS_PATH := ..
endif
CFLAGS ?= -O2
WARNING_CFLAGS ?= -Wall -Wextra -Wformat=2 -Wno-format-nonliteral
WARNING_CXXFLAGS ?= -Wall -Wextra -Wformat=2 -Wno-format-nonliteral
LDFLAGS ?=
LOCAL_CFLAGS = $(WARNING_CFLAGS) -I$(MBEDTLS_TEST_PATH)/include -I$(MBEDTLS_PATH)/include -D_FILE_OFFSET_BITS=64
LOCAL_CXXFLAGS = $(WARNING_CXXFLAGS) -I$(MBEDTLS_PATH)/include -I$(MBEDTLS_PATH)/tests/include -D_FILE_OFFSET_BITS=64
LOCAL_LDFLAGS = ${MBEDTLS_TEST_OBJS} \
-L$(MBEDTLS_PATH)/library \
-lmbedtls$(SHARED_SUFFIX) \
-lmbedx509$(SHARED_SUFFIX) \
-lmbedcrypto$(SHARED_SUFFIX)
include $(MBEDTLS_PATH)/3rdparty/Makefile.inc
LOCAL_CFLAGS+=$(THIRDPARTY_INCLUDES)
ifndef SHARED
MBEDLIBS=$(MBEDTLS_PATH)/library/libmbedcrypto.a $(MBEDTLS_PATH)/library/libmbedx509.a $(MBEDTLS_PATH)/library/libmbedtls.a
else
MBEDLIBS=$(MBEDTLS_PATH)/library/libmbedcrypto.$(DLEXT) $(MBEDTLS_PATH)/library/libmbedx509.$(DLEXT) $(MBEDTLS_PATH)/library/libmbedtls.$(DLEXT)
endif
ifdef DEBUG
LOCAL_CFLAGS += -g3
endif
# if we're running on Windows, build for Windows
ifdef WINDOWS
WINDOWS_BUILD=1
endif
## Usage: $(call remove_enabled_options,PREPROCESSOR_INPUT)
## Remove the preprocessor symbols that are set in the current configuration
## from PREPROCESSOR_INPUT. Also normalize whitespace.
## Example:
## $(call remove_enabled_options,MBEDTLS_FOO MBEDTLS_BAR)
## This expands to an empty string "" if MBEDTLS_FOO and MBEDTLS_BAR are both
## enabled, to "MBEDTLS_FOO" if MBEDTLS_BAR is enabled but MBEDTLS_FOO is
## disabled, etc.
##
## This only works with a Unix-like shell environment (Bourne/POSIX-style shell
## and standard commands) and a Unix-like compiler (supporting -E). In
## other environments, the output is likely to be empty.
define remove_enabled_options
$(strip $(shell
exec 2>/dev/null;
{ echo '#include <mbedtls/build_info.h>'; echo $(1); } |
$(CC) $(LOCAL_CFLAGS) $(CFLAGS) -E - |
tail -n 1
))
endef
ifdef WINDOWS_BUILD
DLEXT=dll
EXEXT=.exe
LOCAL_LDFLAGS += -lws2_32 -lbcrypt
ifdef SHARED
SHARED_SUFFIX=.$(DLEXT)
endif
else # Not building for Windows
DLEXT ?= so
EXEXT=
SHARED_SUFFIX=
ifndef THREADING
# Auto-detect configurations with pthread.
# If the call to remove_enabled_options returns "control", the symbols
# are confirmed set and we link with pthread.
# If the auto-detection fails, the result of the call is empty and
# we keep THREADING undefined.
ifeq (control,$(call remove_enabled_options,control MBEDTLS_THREADING_C MBEDTLS_THREADING_PTHREAD))
THREADING := pthread
endif
endif
ifeq ($(THREADING),pthread)
LOCAL_LDFLAGS += -lpthread
endif
endif
ifdef WINDOWS
PYTHON ?= python
else
PYTHON ?= $(shell if type python3 >/dev/null 2>/dev/null; then echo python3; else echo python; fi)
endif
# See root Makefile
GEN_FILES ?= yes
ifdef GEN_FILES
gen_file_dep =
else
gen_file_dep = |
endif
default: all
$(MBEDLIBS):
$(MAKE) -C $(MBEDTLS_PATH)/library
neat: clean
ifndef WINDOWS
rm -f $(GENERATED_FILES)
else
for %f in ($(subst /,\,$(GENERATED_FILES))) if exist %f del /Q /F %f
endif
# Auxiliary modules used by tests and some sample programs
MBEDTLS_CORE_TEST_OBJS = $(patsubst %.c,%.o,$(wildcard \
${MBEDTLS_TEST_PATH}/src/*.c \
${MBEDTLS_TEST_PATH}/src/drivers/*.c \
))
# Additional auxiliary modules for TLS testing
MBEDTLS_TLS_TEST_OBJS = $(patsubst %.c,%.o,$(wildcard \
${MBEDTLS_TEST_PATH}/src/test_helpers/*.c \
))
MBEDTLS_TEST_OBJS = $(MBEDTLS_CORE_TEST_OBJS) $(MBEDTLS_TLS_TEST_OBJS)
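The remove_enabled_options macro above works by pushing a probe line through the C preprocessor: macro names that the current configuration #defines expand away (these options are plain value-less defines), while undefined names survive verbatim, so the probe word control comes back alone exactly when both threading options are enabled. The pure-Python mock below illustrates that behaviour without invoking a real compiler; it is an assumed simplification, not what the Makefile actually runs.

```python
def remove_enabled_options(symbols, defined):
    """Mock of the preprocessor trick: drop every symbol that the
    configuration defines, keep the rest (assumes value-less defines)."""
    return ' '.join(s for s in symbols.split() if s not in defined)

probe = 'control MBEDTLS_THREADING_C MBEDTLS_THREADING_PTHREAD'

# Both threading options enabled: only the probe word survives, which is
# the condition under which the Makefile links with -lpthread.
print(remove_enabled_options(probe, {'MBEDTLS_THREADING_C',
                                     'MBEDTLS_THREADING_PTHREAD'}))  # control

# Threading disabled: the macro names survive and the ifeq check fails.
print(remove_enabled_options(probe, set()))
```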

externals/mbedtls/scripts/config.pl vendored Executable file
@@ -0,0 +1,14 @@
#!/usr/bin/env perl
# Backward compatibility redirection
## Copyright The Mbed TLS Contributors
## SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
##
my $py = $0;
$py =~ s/\.pl$/.py/ or die "Unable to determine the name of the Python script";
exec 'python3', $py, @ARGV;
print STDERR "$0: python3: $!. Trying python instead.\n";
exec 'python', $py, @ARGV;
print STDERR "$0: python: $!\n";
exit 127;
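The fallback chain above (exec python3 and, only if that fails, exec python) can be sketched in Python. pick_interpreter is a hypothetical helper that merely locates an interpreter on PATH rather than replacing the current process as exec does.

```python
import shutil

def pick_interpreter(candidates=('python3', 'python')):
    """Return the path of the first candidate found on PATH, or None."""
    for name in candidates:
        path = shutil.which(name)
        if path:
            return path
    return None

chosen = pick_interpreter()
print(chosen if chosen else "no Python interpreter found")
```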

externals/mbedtls/scripts/config.py vendored Executable file
@@ -0,0 +1,602 @@
#!/usr/bin/env python3
"""Mbed TLS configuration file manipulation library and tool
Basic usage, to read the Mbed TLS configuration:
config = ConfigFile()
if 'MBEDTLS_RSA_C' in config: print('RSA is enabled')
"""
# Note that as long as Mbed TLS 2.28 LTS is maintained, the version of
# this script in the mbedtls-2.28 branch must remain compatible with
# Python 3.4. The version in development may only use more recent features
# in parts that are not backported to 2.28.
## Copyright The Mbed TLS Contributors
## SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
##
import os
import re
class Setting:
"""Representation of one Mbed TLS mbedtls_config.h setting.
Fields:
* name: the symbol name ('MBEDTLS_xxx').
* value: the value of the macro. The empty string for a plain #define
with no value.
* active: True if name is defined, False if a #define for name is
present in mbedtls_config.h but commented out.
* section: the name of the section that contains this symbol.
"""
# pylint: disable=too-few-public-methods
def __init__(self, active, name, value='', section=None):
self.active = active
self.name = name
self.value = value
self.section = section
class Config:
"""Representation of the Mbed TLS configuration.
In the documentation of this class, a symbol is said to be *active*
if there is a #define for it that is not commented out, and *known*
if there is a #define for it whether commented out or not.
This class supports the following protocols:
* `name in config` is `True` if the symbol `name` is active, `False`
otherwise (whether `name` is inactive or not known).
* `config[name]` is the value of the macro `name`. If `name` is inactive,
raise `KeyError` (even if `name` is known).
* `config[name] = value` sets the value associated to `name`. `name`
must be known, but does not need to be set. This does not cause
name to become set.
"""
def __init__(self):
self.settings = {}
def __contains__(self, name):
"""True if the given symbol is active (i.e. set).
False if the given symbol is not set, even if a definition
is present but commented out.
"""
return name in self.settings and self.settings[name].active
def all(self, *names):
"""True if all the elements of names are active (i.e. set)."""
return all(self.__contains__(name) for name in names)
def any(self, *names):
"""True if at least one of the symbols in names is active (i.e. set)."""
return any(self.__contains__(name) for name in names)
def known(self, name):
"""True if a #define for name is present, whether it's commented out or not."""
return name in self.settings
def __getitem__(self, name):
"""Get the value of name, i.e. what the preprocessor symbol expands to.
If name is not known, raise KeyError. name does not need to be active.
"""
return self.settings[name].value
def get(self, name, default=None):
"""Get the value of name. If name is inactive (not set), return default.
If a #define for name is present and not commented out, return
its expansion, even if this is the empty string.
If a #define for name is present but commented out, return default.
"""
if name in self.settings:
return self.settings[name].value
else:
return default
def __setitem__(self, name, value):
"""If name is known, set its value.
If name is not known, raise KeyError.
"""
self.settings[name].value = value
def set(self, name, value=None):
"""Set name to the given value and make it active.
If value is None and name is already known, don't change its value.
If value is None and name is not known, set its value to the empty
string.
"""
if name in self.settings:
if value is not None:
self.settings[name].value = value
self.settings[name].active = True
else:
self.settings[name] = Setting(True, name, value=value)
def unset(self, name):
"""Make name unset (inactive).
name remains known if it was known before.
"""
if name not in self.settings:
return
self.settings[name].active = False
def adapt(self, adapter):
"""Run adapter on each known symbol and (de)activate it accordingly.
`adapter` must be a function that returns a boolean. It is called as
`adapter(name, active, section)` for each setting, where `active` is
`True` if `name` is set and `False` if `name` is known but unset,
and `section` is the name of the section containing `name`. If
`adapter` returns `True`, then set `name` (i.e. make it active),
otherwise unset `name` (i.e. make it known but inactive).
"""
for setting in self.settings.values():
setting.active = adapter(setting.name, setting.active,
setting.section)
def change_matching(self, regexs, enable):
"""Change all symbols matching one of the regexs to the desired state."""
if not regexs:
return
regex = re.compile('|'.join(regexs))
for setting in self.settings.values():
if regex.search(setting.name):
setting.active = enable
def is_full_section(section):
"""Is this section affected by "config.py full" and friends?"""
return section.endswith('support') or section.endswith('modules')
def realfull_adapter(_name, active, section):
"""Activate all symbols found in the global and boolean feature sections.
This is intended for building the documentation, including the
documentation of settings that are activated by defining an optional
preprocessor macro.
Do not activate definitions in the section containing symbols that are
supposed to be defined and documented in their own module.
"""
if section == 'Module configuration options':
return active
return True
# The goal of the full configuration is to have everything that can be tested
# together. This includes deprecated or insecure options. It excludes:
# * Options that require additional build dependencies or unusual hardware.
# * Options that make testing less effective.
# * Options that are incompatible with other options, or more generally that
# interact with other parts of the code in such a way that a bulk enabling
# is not a good way to test them.
# * Options that remove features.
EXCLUDE_FROM_FULL = frozenset([
#pylint: disable=line-too-long
'MBEDTLS_AES_ONLY_128_BIT_KEY_LENGTH', # interacts with CTR_DRBG_128_BIT_KEY
'MBEDTLS_AES_USE_HARDWARE_ONLY', # hardware dependency
'MBEDTLS_BLOCK_CIPHER_NO_DECRYPT', # incompatible with ECB in PSA, CBC/XTS/NIST_KW/DES
'MBEDTLS_CTR_DRBG_USE_128_BIT_KEY', # interacts with ENTROPY_FORCE_SHA256
'MBEDTLS_DEPRECATED_REMOVED', # conflicts with deprecated options
'MBEDTLS_DEPRECATED_WARNING', # conflicts with deprecated options
'MBEDTLS_ECDH_VARIANT_EVEREST_ENABLED', # influences the use of ECDH in TLS
'MBEDTLS_ECP_NO_FALLBACK', # removes internal ECP implementation
'MBEDTLS_ECP_WITH_MPI_UINT', # disables the default ECP and is experimental
'MBEDTLS_ENTROPY_FORCE_SHA256', # interacts with CTR_DRBG_128_BIT_KEY
'MBEDTLS_HAVE_SSE2', # hardware dependency
'MBEDTLS_MEMORY_BACKTRACE', # depends on MEMORY_BUFFER_ALLOC_C
'MBEDTLS_MEMORY_BUFFER_ALLOC_C', # makes sanitizers (e.g. ASan) less effective
'MBEDTLS_MEMORY_DEBUG', # depends on MEMORY_BUFFER_ALLOC_C
'MBEDTLS_NO_64BIT_MULTIPLICATION', # influences anything that uses bignum
'MBEDTLS_NO_DEFAULT_ENTROPY_SOURCES', # removes a feature
'MBEDTLS_NO_PLATFORM_ENTROPY', # removes a feature
'MBEDTLS_NO_UDBL_DIVISION', # influences anything that uses bignum
'MBEDTLS_PSA_P256M_DRIVER_ENABLED', # influences SECP256R1 KeyGen/ECDH/ECDSA
'MBEDTLS_PLATFORM_NO_STD_FUNCTIONS', # removes a feature
'MBEDTLS_PSA_CRYPTO_EXTERNAL_RNG', # behavior change + build dependency
'MBEDTLS_PSA_CRYPTO_KEY_ID_ENCODES_OWNER', # incompatible with USE_PSA_CRYPTO
'MBEDTLS_PSA_CRYPTO_SPM', # platform dependency (PSA SPM)
'MBEDTLS_PSA_INJECT_ENTROPY', # conflicts with platform entropy sources
'MBEDTLS_RSA_NO_CRT', # influences the use of RSA in X.509 and TLS
'MBEDTLS_SHA256_USE_A64_CRYPTO_ONLY', # interacts with *_USE_A64_CRYPTO_IF_PRESENT
'MBEDTLS_SHA256_USE_ARMV8_A_CRYPTO_ONLY', # interacts with *_USE_ARMV8_A_CRYPTO_IF_PRESENT
'MBEDTLS_SHA512_USE_A64_CRYPTO_ONLY', # interacts with *_USE_A64_CRYPTO_IF_PRESENT
'MBEDTLS_SHA256_USE_A64_CRYPTO_IF_PRESENT', # setting *_USE_ARMV8_A_CRYPTO is sufficient
'MBEDTLS_TEST_CONSTANT_FLOW_MEMSAN', # build dependency (clang+memsan)
'MBEDTLS_TEST_CONSTANT_FLOW_VALGRIND', # build dependency (valgrind headers)
'MBEDTLS_X509_REMOVE_INFO', # removes a feature
])
def is_seamless_alt(name):
"""Whether the xxx_ALT symbol should be included in the full configuration.
Include alternative implementations of platform functions, which are
configurable function pointers that default to the built-in function.
This way we test that the function pointers exist and build correctly
without changing the behavior, and tests can verify that the function
pointers are used by modifying those pointers.
Exclude alternative implementations of library functions since they require
an implementation of the relevant functions and an xxx_alt.h header.
"""
if name in (
'MBEDTLS_PLATFORM_GMTIME_R_ALT',
'MBEDTLS_PLATFORM_SETUP_TEARDOWN_ALT',
'MBEDTLS_PLATFORM_MS_TIME_ALT',
'MBEDTLS_PLATFORM_ZEROIZE_ALT',
):
# Similar to non-platform xxx_ALT, requires platform_alt.h
return False
return name.startswith('MBEDTLS_PLATFORM_')
def include_in_full(name):
"""Rules for symbols in the "full" configuration."""
if name in EXCLUDE_FROM_FULL:
return False
if name.endswith('_ALT'):
return is_seamless_alt(name)
return True
def full_adapter(name, active, section):
"""Config adapter for "full"."""
if not is_full_section(section):
return active
return include_in_full(name)
# The baremetal configuration excludes options that require a library or
# operating system feature that is typically not present on bare metal
# systems. Features that are excluded from "full" won't be in "baremetal"
# either (unless explicitly turned on in baremetal_adapter) so they don't
# need to be repeated here.
EXCLUDE_FROM_BAREMETAL = frozenset([
#pylint: disable=line-too-long
'MBEDTLS_ENTROPY_NV_SEED', # requires a filesystem and FS_IO or alternate NV seed hooks
'MBEDTLS_FS_IO', # requires a filesystem
'MBEDTLS_HAVE_TIME', # requires a clock
'MBEDTLS_HAVE_TIME_DATE', # requires a clock
'MBEDTLS_NET_C', # requires POSIX-like networking
'MBEDTLS_PLATFORM_FPRINTF_ALT', # requires FILE* from stdio.h
'MBEDTLS_PLATFORM_NV_SEED_ALT', # requires a filesystem and ENTROPY_NV_SEED
'MBEDTLS_PLATFORM_TIME_ALT', # requires a clock and HAVE_TIME
'MBEDTLS_PSA_CRYPTO_SE_C', # requires a filesystem and PSA_CRYPTO_STORAGE_C
'MBEDTLS_PSA_CRYPTO_STORAGE_C', # requires a filesystem
'MBEDTLS_PSA_ITS_FILE_C', # requires a filesystem
'MBEDTLS_THREADING_C', # requires a threading interface
'MBEDTLS_THREADING_PTHREAD', # requires pthread
'MBEDTLS_TIMING_C', # requires a clock
'MBEDTLS_SHA256_USE_A64_CRYPTO_IF_PRESENT', # requires an OS for runtime-detection
'MBEDTLS_SHA256_USE_ARMV8_A_CRYPTO_IF_PRESENT', # requires an OS for runtime-detection
'MBEDTLS_SHA512_USE_A64_CRYPTO_IF_PRESENT', # requires an OS for runtime-detection
])
def keep_in_baremetal(name):
"""Rules for symbols in the "baremetal" configuration."""
if name in EXCLUDE_FROM_BAREMETAL:
return False
return True
def baremetal_adapter(name, active, section):
"""Config adapter for "baremetal"."""
if not is_full_section(section):
return active
if name == 'MBEDTLS_NO_PLATFORM_ENTROPY':
# No OS-provided entropy source
return True
return include_in_full(name) and keep_in_baremetal(name)
# This set contains options that are mostly for debugging or test purposes,
# and therefore should be excluded when doing code size measurements.
# Options that are their own module (such as MBEDTLS_ERROR_C) are not listed
# and therefore will be included when doing code size measurements.
EXCLUDE_FOR_SIZE = frozenset([
'MBEDTLS_DEBUG_C', # large code size increase in TLS
'MBEDTLS_SELF_TEST', # increases the size of many modules
'MBEDTLS_TEST_HOOKS', # only useful with the hosted test framework, increases code size
])
def baremetal_size_adapter(name, active, section):
if name in EXCLUDE_FOR_SIZE:
return False
return baremetal_adapter(name, active, section)
def include_in_crypto(name):
"""Rules for symbols in a crypto configuration."""
if name.startswith('MBEDTLS_X509_') or \
name.startswith('MBEDTLS_SSL_') or \
name.startswith('MBEDTLS_KEY_EXCHANGE_'):
return False
if name in [
'MBEDTLS_DEBUG_C', # part of libmbedtls
'MBEDTLS_NET_C', # part of libmbedtls
'MBEDTLS_PKCS7_C', # part of libmbedx509
]:
return False
return True
def crypto_adapter(adapter):
"""Modify an adapter to disable non-crypto symbols.
``crypto_adapter(adapter)(name, active, section)`` is like
``adapter(name, active, section)``, but unsets all X.509 and TLS symbols.
"""
def continuation(name, active, section):
if not include_in_crypto(name):
return False
if adapter is None:
return active
return adapter(name, active, section)
return continuation
DEPRECATED = frozenset([
'MBEDTLS_PSA_CRYPTO_SE_C',
])
def no_deprecated_adapter(adapter):
"""Modify an adapter to disable deprecated symbols.
``no_deprecated_adapter(adapter)(name, active, section)`` is like
``adapter(name, active, section)``, but unsets all deprecated symbols
and sets ``MBEDTLS_DEPRECATED_REMOVED``.
"""
def continuation(name, active, section):
if name == 'MBEDTLS_DEPRECATED_REMOVED':
return True
if name in DEPRECATED:
return False
if adapter is None:
return active
return adapter(name, active, section)
return continuation
def no_platform_adapter(adapter):
"""Modify an adapter to disable platform symbols.
``no_platform_adapter(adapter)(name, active, section)`` is like
``adapter(name, active, section)``, but unsets all platform symbols other
than ``MBEDTLS_PLATFORM_C``.
"""
def continuation(name, active, section):
# Allow MBEDTLS_PLATFORM_C but remove all other platform symbols.
if name.startswith('MBEDTLS_PLATFORM_') and name != 'MBEDTLS_PLATFORM_C':
return False
if adapter is None:
return active
return adapter(name, active, section)
return continuation
class ConfigFile(Config):
"""Representation of the Mbed TLS configuration read for a file.
See the documentation of the `Config` class for methods to query
and modify the configuration.
"""
_path_in_tree = 'include/mbedtls/mbedtls_config.h'
default_path = [_path_in_tree,
os.path.join(os.path.dirname(__file__),
os.pardir,
_path_in_tree),
os.path.join(os.path.dirname(os.path.abspath(os.path.dirname(__file__))),
_path_in_tree)]
def __init__(self, filename=None):
"""Read the Mbed TLS configuration file."""
if filename is None:
for candidate in self.default_path:
if os.path.lexists(candidate):
filename = candidate
break
else:
raise Exception('Mbed TLS configuration file not found',
self.default_path)
super().__init__()
self.filename = filename
self.current_section = 'header'
with open(filename, 'r', encoding='utf-8') as file:
self.templates = [self._parse_line(line) for line in file]
self.current_section = None
def set(self, name, value=None):
if name not in self.settings:
self.templates.append((name, '', '#define ' + name + ' '))
super().set(name, value)
_define_line_regexp = (r'(?P<indentation>\s*)' +
r'(?P<commented_out>(//\s*)?)' +
r'(?P<define>#\s*define\s+)' +
r'(?P<name>\w+)' +
r'(?P<arguments>(?:\((?:\w|\s|,)*\))?)' +
r'(?P<separator>\s*)' +
r'(?P<value>.*)')
_section_line_regexp = (r'\s*/?\*+\s*[\\@]name\s+SECTION:\s*' +
r'(?P<section>.*)[ */]*')
_config_line_regexp = re.compile(r'|'.join([_define_line_regexp,
_section_line_regexp]))
def _parse_line(self, line):
"""Parse a line in mbedtls_config.h and return the corresponding template."""
line = line.rstrip('\r\n')
m = re.match(self._config_line_regexp, line)
if m is None:
return line
elif m.group('section'):
self.current_section = m.group('section')
return line
else:
active = not m.group('commented_out')
name = m.group('name')
value = m.group('value')
template = (name,
m.group('indentation'),
m.group('define') + name +
m.group('arguments') + m.group('separator'))
self.settings[name] = Setting(active, name, value,
self.current_section)
return template
def _format_template(self, name, indent, middle):
"""Build a line for mbedtls_config.h for the given setting.
The line has the form "<indent>#define <name> <value>"
where <middle> is "#define <name> ".
"""
setting = self.settings[name]
value = setting.value
if value is None:
value = ''
# Normally the whitespace to separate the symbol name from the
# value is part of middle, and there's no whitespace for a symbol
# with no value. But if a symbol has been changed from having a
# value to not having one, the whitespace is wrong, so fix it.
if value:
if middle[-1] not in '\t ':
middle += ' '
else:
middle = middle.rstrip()
return ''.join([indent,
'' if setting.active else '//',
middle,
value]).rstrip()
def write_to_stream(self, output):
"""Write the whole configuration to output."""
for template in self.templates:
if isinstance(template, str):
line = template
else:
line = self._format_template(*template)
output.write(line + '\n')
def write(self, filename=None):
"""Write the whole configuration to the file it was read from.
If filename is specified, write to this file instead.
"""
if filename is None:
filename = self.filename
with open(filename, 'w', encoding='utf-8') as output:
self.write_to_stream(output)
if __name__ == '__main__':
def main():
"""Command line mbedtls_config.h manipulation tool."""
parser = argparse.ArgumentParser(description="""
Mbed TLS configuration file manipulation tool.
""")
parser.add_argument('--file', '-f',
help="""File to read (and modify if requested).
Default: {}.
""".format(ConfigFile.default_path))
parser.add_argument('--force', '-o',
action='store_true',
help="""For the set command, if SYMBOL is not
present, add a definition for it.""")
parser.add_argument('--write', '-w', metavar='FILE',
help="""File to write to instead of the input file.""")
subparsers = parser.add_subparsers(dest='command',
title='Commands')
parser_get = subparsers.add_parser('get',
help="""Find the value of SYMBOL
and print it. Exit with
status 0 if a #define for SYMBOL is
found, 1 otherwise.
""")
parser_get.add_argument('symbol', metavar='SYMBOL')
parser_set = subparsers.add_parser('set',
help="""Set SYMBOL to VALUE.
If VALUE is omitted, just uncomment
the #define for SYMBOL.
Error out if a line defining
SYMBOL (commented or not) is not
found, unless --force is passed.
""")
parser_set.add_argument('symbol', metavar='SYMBOL')
parser_set.add_argument('value', metavar='VALUE', nargs='?',
default='')
parser_set_all = subparsers.add_parser('set-all',
help="""Uncomment all #define
whose name contains a match for
REGEX.""")
parser_set_all.add_argument('regexs', metavar='REGEX', nargs='*')
parser_unset = subparsers.add_parser('unset',
help="""Comment out the #define
for SYMBOL. Do nothing if none
is present.""")
parser_unset.add_argument('symbol', metavar='SYMBOL')
parser_unset_all = subparsers.add_parser('unset-all',
help="""Comment out all #define
whose name contains a match for
REGEX.""")
parser_unset_all.add_argument('regexs', metavar='REGEX', nargs='*')
def add_adapter(name, function, description):
subparser = subparsers.add_parser(name, help=description)
subparser.set_defaults(adapter=function)
add_adapter('baremetal', baremetal_adapter,
"""Like full, but exclude features that require platform
features such as file input-output.""")
add_adapter('baremetal_size', baremetal_size_adapter,
"""Like baremetal, but exclude debugging features.
Useful for code size measurements.""")
add_adapter('full', full_adapter,
"""Uncomment most features.
Exclude alternative implementations and platform support
options, as well as some options that are awkward to test.
""")
add_adapter('full_no_deprecated', no_deprecated_adapter(full_adapter),
"""Uncomment most non-deprecated features.
Like "full", but without deprecated features.
""")
add_adapter('full_no_platform', no_platform_adapter(full_adapter),
"""Uncomment most non-platform features.
Like "full", but without platform features.
""")
add_adapter('realfull', realfull_adapter,
"""Uncomment all boolean #defines.
Suitable for generating documentation, but not for building.""")
add_adapter('crypto', crypto_adapter(None),
"""Only include crypto features. Exclude X.509 and TLS.""")
add_adapter('crypto_baremetal', crypto_adapter(baremetal_adapter),
"""Like baremetal, but with only crypto features,
excluding X.509 and TLS.""")
add_adapter('crypto_full', crypto_adapter(full_adapter),
"""Like full, but with only crypto features,
excluding X.509 and TLS.""")
args = parser.parse_args()
config = ConfigFile(args.file)
if args.command is None:
parser.print_help()
return 1
elif args.command == 'get':
if args.symbol in config:
value = config[args.symbol]
if value:
sys.stdout.write(value + '\n')
return 0 if args.symbol in config else 1
elif args.command == 'set':
if not args.force and args.symbol not in config.settings:
sys.stderr.write("A #define for the symbol {} "
"was not found in {}\n"
.format(args.symbol, config.filename))
return 1
config.set(args.symbol, value=args.value)
elif args.command == 'set-all':
config.change_matching(args.regexs, True)
elif args.command == 'unset':
config.unset(args.symbol)
elif args.command == 'unset-all':
config.change_matching(args.regexs, False)
else:
config.adapt(args.adapter)
config.write(args.write)
return 0
# Import modules only used by main only if main is defined and called.
# pylint: disable=wrong-import-position
import argparse
import sys
sys.exit(main())
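The parsing done by `ConfigFile._parse_line` hinges on `_define_line_regexp`. The following standalone sketch reproduces that regular expression outside the class to show what it captures from typical `mbedtls_config.h` lines; the test strings are made up for illustration, and the `parse` helper is hypothetical, not part of the script above.

```python
import re

# Reproduction of _define_line_regexp from ConfigFile, used standalone.
DEFINE_LINE = re.compile(
    r'(?P<indentation>\s*)'
    r'(?P<commented_out>(//\s*)?)'
    r'(?P<define>#\s*define\s+)'
    r'(?P<name>\w+)'
    r'(?P<arguments>(?:\((?:\w|\s|,)*\))?)'
    r'(?P<separator>\s*)'
    r'(?P<value>.*)')

def parse(line):
    """Return (active, name, value) for a #define line, or None otherwise."""
    m = DEFINE_LINE.match(line)
    if m is None:
        return None
    return (not m.group('commented_out'), m.group('name'), m.group('value'))

print(parse('#define MBEDTLS_AES_C'))             # active symbol, no value
print(parse('//#define MBEDTLS_DEBUG_C'))         # commented-out symbol
print(parse('#define MBEDTLS_MPI_MAX_SIZE 1024')) # symbol with a value
```

A commented-out define is still parsed into a template, which is what lets `set`/`unset` toggle the `//` prefix without rewriting the rest of the line.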


@ -0,0 +1,71 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"properties": {
"_comment": {
"type": "string"
},
"prefix": {
"type": "string",
"pattern": "^[A-Z_a-z][0-9A-Z_a-z]*$"
},
"type": {
"type": "string",
"const": ["opaque"]
},
"location": {
"type": ["integer","string"],
"pattern": "^(0x|0X)?[a-fA-F0-9]+$"
},
"mbedtls/h_condition": {
"type": "string"
},
"headers": {
"type": "array",
"items": {
"type": "string"
},
"default": []
},
"capabilities": {
"type": "array",
"items": [
{
"type": "object",
"properties": {
"_comment": {
"type": "string"
},
"mbedtls/c_condition": {
"type": "string"
},
"entry_points": {
"type": "array",
"items": {
"type": "string"
}
},
"names": {
"type": "object",
"patternProperties": {
"^[A-Z_a-z][0-9A-Z_a-z]*$": {
"type": "string",
"pattern": "^[A-Z_a-z][0-9A-Z_a-z]*$"
}
}
}
},
"required": [
"entry_points"
]
}
]
}
},
"required": [
"prefix",
"type",
"location",
"capabilities"
]
}
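The schema above declares `prefix`, `type`, `location`, and `capabilities` as required, and constrains `prefix` to be a C identifier. The sketch below hand-rolls just those two checks against a minimal driver description; a real pipeline would use a full JSON Schema validator, and `check_opaque_driver` plus its sample input are hypothetical.

```python
import json
import re

# Required keys and the "prefix" pattern, copied from the schema above.
REQUIRED = ('prefix', 'type', 'location', 'capabilities')
PREFIX_RE = re.compile(r'^[A-Z_a-z][0-9A-Z_a-z]*$')

def check_opaque_driver(text):
    """Minimal check of an opaque-driver JSON description."""
    driver = json.loads(text)
    missing = [key for key in REQUIRED if key not in driver]
    if missing:
        return 'missing: ' + ', '.join(missing)
    if not PREFIX_RE.match(driver['prefix']):
        return 'bad prefix'
    return 'ok'

sample = '''{"prefix": "mbedtls_test", "type": "opaque",
             "location": "0x7fffff",
             "capabilities": [{"entry_points": ["import_key"]}]}'''
print(check_opaque_driver(sample))
```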


@ -0,0 +1,70 @@
{
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"properties": {
"_comment": {
"type": "string"
},
"prefix": {
"type": "string",
"pattern": "^[A-Z_a-z][0-9A-Z_a-z]*$"
},
"type": {
"type": "string",
"const": ["transparent"]
},
"mbedtls/h_condition": {
"type": "string"
},
"headers": {
"type": "array",
"items": {
"type": "string"
},
"default": []
},
"capabilities": {
"type": "array",
"items": [
{
"type": "object",
"properties": {
"_comment": {
"type": "string"
},
"mbedtls/c_condition": {
"type": "string"
},
"entry_points": {
"type": "array",
"items": {
"type": "string"
}
},
"names": {
"type": "object",
"patternProperties": {
"^[A-Z_a-z][0-9A-Z_a-z]*$": {
"type": "string",
"pattern": "^[A-Z_a-z][0-9A-Z_a-z]*$"
}
}
},
"fallback": {
"type": "boolean",
"default": "false"
}
},
"required": [
"entry_points"
]
}
]
}
},
"required": [
"prefix",
"type",
"capabilities"
]
}


@ -0,0 +1 @@
["mbedtls_test_opaque_driver.json","mbedtls_test_transparent_driver.json","p256_transparent_driver.json"]


@ -0,0 +1,20 @@
{
"prefix": "mbedtls_test",
"type": "opaque",
"location": "0x7fffff",
"mbedtls/h_condition": "defined(PSA_CRYPTO_DRIVER_TEST)",
"headers": ["test/drivers/test_driver.h"],
"capabilities": [
{
"_comment": "The Mbed TLS opaque driver supports import key/export key/export_public key",
"mbedtls/c_condition": "defined(PSA_CRYPTO_DRIVER_TEST)",
"entry_points": ["import_key", "export_key", "export_public_key"]
},
{
"_comment": "The Mbed TLS opaque driver supports copy key/ get builtin key",
"mbedtls/c_condition": "defined(PSA_CRYPTO_DRIVER_TEST)",
"entry_points": ["copy_key", "get_builtin_key"],
"names": {"copy_key":"mbedtls_test_opaque_copy_key", "get_builtin_key":"mbedtls_test_opaque_get_builtin_key"}
}
]
}


@ -0,0 +1,22 @@
{
"prefix": "mbedtls_test",
"type": "transparent",
"mbedtls/h_condition": "defined(PSA_CRYPTO_DRIVER_TEST)",
"headers": ["test/drivers/test_driver.h"],
"capabilities": [
{
"_comment": "The Mbed TLS transparent driver supports import key/export key",
"mbedtls/c_condition": "defined(PSA_CRYPTO_DRIVER_TEST)",
"entry_points": ["import_key"],
"fallback": true
},
{
"_comment": "The Mbed TLS transparent driver supports export_public key",
"mbedtls/c_condition": "defined(PSA_CRYPTO_DRIVER_TEST)",
"entry_points": ["export_public_key"],
"fallback": true,
"names": {"export_public_key":"mbedtls_test_transparent_export_public_key"}
}
]
}


@ -0,0 +1,20 @@
{
"prefix": "p256",
"type": "transparent",
"mbedtls/h_condition": "defined(MBEDTLS_PSA_P256M_DRIVER_ENABLED)",
"headers": ["../3rdparty/p256-m/p256-m_driver_entrypoints.h"],
"capabilities": [
{
"mbedtls/c_condition": "defined(MBEDTLS_PSA_P256M_DRIVER_ENABLED)",
"_comment_entry_points": "This is not the complete list of entry points supported by this driver, only those that are currently supported in JSON. See docs/psa-driver-example-and-guide.md",
"entry_points": ["import_key", "export_public_key"],
"algorithms": ["PSA_ALG_ECDH", "PSA_ALG_ECDSA(PSA_ALG_ANY_HASH)"],
"key_types": [
"PSA_KEY_TYPE_ECC_KEY_PAIR(PSA_ECC_FAMILY_SECP_R1)",
"PSA_KEY_TYPE_ECC_PUBLIC_KEY(PSA_ECC_FAMILY_SECP_R1)"
],
"key_sizes": [256],
"fallback": false
}
]
}


@ -0,0 +1,17 @@
{# One Shot function's dispatch code for opaque drivers.
Expected inputs:
* drivers: the list of driver descriptions.
* entry_point: the name of the entry point that this function dispatches to.
* entry_point_param(driver): the parameters to pass to the entry point.
* nest_indent: number of extra spaces to indent the code to.
-#}
{% for driver in drivers if driver.type == "opaque" -%}
{% for capability in driver.capabilities if entry_point in capability.entry_points -%}
#if ({% if capability['mbedtls/c_condition'] is defined -%}{{ capability['mbedtls/c_condition'] }} {% else -%} {{ 1 }} {% endif %})
{%- filter indent(width = nest_indent) %}
case {{ driver.location }}:
return( {{ entry_point_name(capability, entry_point, driver) }}({{entry_point_param(driver) | indent(20)}}));
{% endfilter -%}
#endif
{% endfor %}
{% endfor %}


@ -0,0 +1,19 @@
{# One Shot function's dispatch code for transparent drivers.
Expected inputs:
* drivers: the list of driver descriptions.
* entry_point: the name of the entry point that this function dispatches to.
* entry_point_param(driver): the parameters to pass to the entry point.
* nest_indent: number of extra spaces to indent the code to.
-#}
{% for driver in drivers if driver.type == "transparent" -%}
{% for capability in driver.capabilities if entry_point in capability.entry_points -%}
#if ({% if capability['mbedtls/c_condition'] is defined -%}{{ capability['mbedtls/c_condition'] }} {% else -%} {{ 1 }} {% endif %})
{%- filter indent(width = nest_indent) %}
status = {{ entry_point_name(capability, entry_point, driver) }}({{entry_point_param(driver) | indent(20)}});
if( status != PSA_ERROR_NOT_SUPPORTED )
return( status );
{% endfilter -%}
#endif
{% endfor %}
{% endfor %}
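The transparent template above expands into a cascade: each accelerator's entry point is called in turn, a `PSA_ERROR_NOT_SUPPORTED` result falls through to the next driver, and any other status (success or a real error) is returned immediately, with the built-in implementation as the final fallback. A plain-Python sketch of that control flow, with made-up driver functions and an illustrative error-code value:

```python
# Sketch of the generated transparent-driver cascade. The driver functions
# and the byte-length criterion are invented for illustration only.
PSA_ERROR_NOT_SUPPORTED = -134
PSA_SUCCESS = 0

def accel_a(data):
    """Hypothetical accelerator that only handles 32-byte inputs."""
    return PSA_SUCCESS if len(data) == 32 else PSA_ERROR_NOT_SUPPORTED

def builtin(data):
    """Software fallback: always succeeds in this sketch."""
    return PSA_SUCCESS

def dispatch(data, drivers=(accel_a,)):
    for entry_point in drivers:          # corresponds to the {% for driver %} loop
        status = entry_point(data)
        if status != PSA_ERROR_NOT_SUPPORTED:
            return status                # driver handled it (success or real error)
    return builtin(data)                 # fell through: no accelerator supported it

print(dispatch(b'\x00' * 32))            # handled by the accelerator
print(dispatch(b'\x00' * 16))            # falls back to the built-in implementation
```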

File diff suppressed because it is too large.


@ -0,0 +1,224 @@
/*
* Functions to delegate cryptographic operations to an available
* and appropriate accelerator.
* Warning: This file is now auto-generated.
*/
/* Copyright The Mbed TLS Contributors
* SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
*/
/* BEGIN-common headers */
#include "common.h"
#include "psa_crypto_aead.h"
#include "psa_crypto_cipher.h"
#include "psa_crypto_core.h"
#include "psa_crypto_driver_wrappers_no_static.h"
#include "psa_crypto_hash.h"
#include "psa_crypto_mac.h"
#include "psa_crypto_pake.h"
#include "psa_crypto_rsa.h"
#include "mbedtls/platform.h"
/* END-common headers */
#if defined(MBEDTLS_PSA_CRYPTO_C)
/* BEGIN-driver headers */
{% for driver in drivers -%}
/* Headers for {{driver.prefix}} {{driver.type}} driver */
{% if driver['mbedtls/h_condition'] is defined -%}
#if {{ driver['mbedtls/h_condition'] }}
{% endif -%}
{% for header in driver.headers -%}
#include "{{ header }}"
{% endfor %}
{% if driver['mbedtls/h_condition'] is defined -%}
#endif
{% endif -%}
{% endfor %}
/* END-driver headers */
/* Auto-generated values depending on which drivers are registered.
* ID 0 is reserved for unallocated operations.
* ID 1 is reserved for the Mbed TLS software driver. */
/* BEGIN-driver id definition */
#define PSA_CRYPTO_MBED_TLS_DRIVER_ID (1)
{% for driver in drivers -%}
#define {{(driver.prefix + "_" + driver.type + "_driver_id").upper()}} ({{ loop.index + 1 }})
{% endfor %}
/* END-driver id */
/* BEGIN-Common Macro definitions */
{% macro entry_point_name(capability, entry_point, driver) -%}
{% if capability.names is defined and entry_point in capability.names.keys() -%}
{{ capability.names[entry_point]}}
{% else -%}
{{driver.prefix}}_{{driver.type}}_{{entry_point}}
{% endif -%}
{% endmacro %}
/* END-Common Macro definitions */
/* Support the 'old' SE interface when asked to */
#if defined(MBEDTLS_PSA_CRYPTO_SE_C)
/* PSA_CRYPTO_DRIVER_PRESENT is defined when either a new-style or old-style
* SE driver is present, to avoid unused argument errors at compile time. */
#ifndef PSA_CRYPTO_DRIVER_PRESENT
#define PSA_CRYPTO_DRIVER_PRESENT
#endif
#include "psa_crypto_se.h"
#endif
/** Get the key buffer size required to store the key material of a key
* associated with an opaque driver.
*
* \param[in] attributes The key attributes.
* \param[out] key_buffer_size Minimum buffer size to contain the key material
*
* \retval #PSA_SUCCESS
* The minimum size for a buffer to contain the key material has been
* returned successfully.
* \retval #PSA_ERROR_NOT_SUPPORTED
* The type and/or the size in bits of the key or the combination of
* the two is not supported.
* \retval #PSA_ERROR_INVALID_ARGUMENT
* The key is declared with a lifetime not known to us.
*/
psa_status_t psa_driver_wrapper_get_key_buffer_size(
const psa_key_attributes_t *attributes,
size_t *key_buffer_size )
{
psa_key_location_t location = PSA_KEY_LIFETIME_GET_LOCATION( attributes->core.lifetime );
psa_key_type_t key_type = attributes->core.type;
size_t key_bits = attributes->core.bits;
*key_buffer_size = 0;
switch( location )
{
#if defined(PSA_CRYPTO_DRIVER_TEST)
case PSA_CRYPTO_TEST_DRIVER_LOCATION:
#if defined(MBEDTLS_PSA_CRYPTO_BUILTIN_KEYS)
/* Emulate property 'builtin_key_size' */
if( psa_key_id_is_builtin(
MBEDTLS_SVC_KEY_ID_GET_KEY_ID(
psa_get_key_id( attributes ) ) ) )
{
*key_buffer_size = sizeof( psa_drv_slot_number_t );
return( PSA_SUCCESS );
}
#endif /* MBEDTLS_PSA_CRYPTO_BUILTIN_KEYS */
*key_buffer_size = mbedtls_test_opaque_size_function( key_type,
key_bits );
return( ( *key_buffer_size != 0 ) ?
PSA_SUCCESS : PSA_ERROR_NOT_SUPPORTED );
#endif /* PSA_CRYPTO_DRIVER_TEST */
default:
(void)key_type;
(void)key_bits;
return( PSA_ERROR_INVALID_ARGUMENT );
}
}
psa_status_t psa_driver_wrapper_export_public_key(
const psa_key_attributes_t *attributes,
const uint8_t *key_buffer, size_t key_buffer_size,
uint8_t *data, size_t data_size, size_t *data_length )
{
{% with entry_point = "export_public_key" -%}
{% macro entry_point_param(driver) -%}
attributes,
key_buffer,
key_buffer_size,
data,
data_size,
data_length
{% endmacro %}
psa_status_t status = PSA_ERROR_INVALID_ARGUMENT;
psa_key_location_t location = PSA_KEY_LIFETIME_GET_LOCATION(
psa_get_key_lifetime( attributes ) );
/* Try dynamically-registered SE interface first */
#if defined(MBEDTLS_PSA_CRYPTO_SE_C)
const psa_drv_se_t *drv;
psa_drv_se_context_t *drv_context;
if( psa_get_se_driver( attributes->core.lifetime, &drv, &drv_context ) )
{
if( ( drv->key_management == NULL ) ||
( drv->key_management->p_export_public == NULL ) )
{
return( PSA_ERROR_NOT_SUPPORTED );
}
return( drv->key_management->p_export_public(
drv_context,
*( (psa_key_slot_number_t *)key_buffer ),
data, data_size, data_length ) );
}
#endif /* MBEDTLS_PSA_CRYPTO_SE_C */
switch( location )
{
case PSA_KEY_LOCATION_LOCAL_STORAGE:
/* Key is stored in the slot in export representation, so
* cycle through all known transparent accelerators */
#if defined(PSA_CRYPTO_ACCELERATOR_DRIVER_PRESENT)
{% with nest_indent=12 %}
{% include "OS-template-transparent.jinja" -%}
{% endwith -%}
#endif /* PSA_CRYPTO_ACCELERATOR_DRIVER_PRESENT */
/* Fell through, meaning no accelerator supports this operation */
return( psa_export_public_key_internal( attributes,
key_buffer,
key_buffer_size,
data,
data_size,
data_length ) );
/* Add cases for opaque driver here */
#if defined(PSA_CRYPTO_ACCELERATOR_DRIVER_PRESENT)
{% with nest_indent=8 %}
{% include "OS-template-opaque.jinja" -%}
{% endwith -%}
#endif /* PSA_CRYPTO_ACCELERATOR_DRIVER_PRESENT */
default:
/* Key is declared with a lifetime not known to us */
return( status );
}
{% endwith %}
}
psa_status_t psa_driver_wrapper_get_builtin_key(
psa_drv_slot_number_t slot_number,
psa_key_attributes_t *attributes,
uint8_t *key_buffer, size_t key_buffer_size, size_t *key_buffer_length )
{
{% with entry_point = "get_builtin_key" -%}
{% macro entry_point_param(driver) -%}
slot_number,
attributes,
key_buffer,
key_buffer_size,
key_buffer_length
{% endmacro %}
psa_key_location_t location = PSA_KEY_LIFETIME_GET_LOCATION( attributes->core.lifetime );
switch( location )
{
#if defined(PSA_CRYPTO_DRIVER_TEST)
{% with nest_indent=8 %}
{% include "OS-template-opaque.jinja" -%}
{% endwith -%}
#endif /* PSA_CRYPTO_DRIVER_TEST */
default:
(void) slot_number;
(void) key_buffer;
(void) key_buffer_size;
(void) key_buffer_length;
return( PSA_ERROR_DOES_NOT_EXIST );
}
{% endwith %}
}
#endif /* MBEDTLS_PSA_CRYPTO_C */


@ -0,0 +1,159 @@
/*
* Error message information
*
* Copyright The Mbed TLS Contributors
* SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
*/
#include "common.h"
#include "mbedtls/error.h"
#if defined(MBEDTLS_ERROR_C) || defined(MBEDTLS_ERROR_STRERROR_DUMMY)
#if defined(MBEDTLS_ERROR_C)
#include "mbedtls/platform.h"
#include <stdio.h>
#include <string.h>
HEADER_INCLUDED
const char *mbedtls_high_level_strerr(int error_code)
{
int high_level_error_code;
if (error_code < 0) {
error_code = -error_code;
}
/* Extract the high-level part from the error code. */
high_level_error_code = error_code & 0xFF80;
switch (high_level_error_code) {
/* Begin Auto-Generated Code. */
HIGH_LEVEL_CODE_CHECKS
/* End Auto-Generated Code. */
default:
break;
}
return NULL;
}
const char *mbedtls_low_level_strerr(int error_code)
{
int low_level_error_code;
if (error_code < 0) {
error_code = -error_code;
}
/* Extract the low-level part from the error code. */
low_level_error_code = error_code & ~0xFF80;
switch (low_level_error_code) {
/* Begin Auto-Generated Code. */
LOW_LEVEL_CODE_CHECKS
/* End Auto-Generated Code. */
default:
break;
}
return NULL;
}
void mbedtls_strerror(int ret, char *buf, size_t buflen)
{
size_t len;
int use_ret;
const char *high_level_error_description = NULL;
const char *low_level_error_description = NULL;
if (buflen == 0) {
return;
}
memset(buf, 0x00, buflen);
if (ret < 0) {
ret = -ret;
}
if (ret & 0xFF80) {
use_ret = ret & 0xFF80;
// Translate high level error code.
high_level_error_description = mbedtls_high_level_strerr(ret);
if (high_level_error_description == NULL) {
mbedtls_snprintf(buf, buflen, "UNKNOWN ERROR CODE (%04X)", (unsigned int) use_ret);
} else {
mbedtls_snprintf(buf, buflen, "%s", high_level_error_description);
}
#if defined(MBEDTLS_SSL_TLS_C)
// Early return in case of a fatal error - do not try to translate low
// level code.
if (use_ret == -(MBEDTLS_ERR_SSL_FATAL_ALERT_MESSAGE)) {
return;
}
#endif /* MBEDTLS_SSL_TLS_C */
}
use_ret = ret & ~0xFF80;
if (use_ret == 0) {
return;
}
// If high level code is present, make a concatenation between both
// error strings.
//
len = strlen(buf);
if (len > 0) {
if (buflen - len < 5) {
return;
}
mbedtls_snprintf(buf + len, buflen - len, " : ");
buf += len + 3;
buflen -= len + 3;
}
// Translate low level error code.
low_level_error_description = mbedtls_low_level_strerr(ret);
if (low_level_error_description == NULL) {
mbedtls_snprintf(buf, buflen, "UNKNOWN ERROR CODE (%04X)", (unsigned int) use_ret);
} else {
mbedtls_snprintf(buf, buflen, "%s", low_level_error_description);
}
}
#else /* MBEDTLS_ERROR_C */
/*
* Provide a dummy implementation when MBEDTLS_ERROR_C is not defined
*/
void mbedtls_strerror(int ret, char *buf, size_t buflen)
{
((void) ret);
if (buflen > 0) {
buf[0] = '\0';
}
}
#endif /* MBEDTLS_ERROR_C */
#if defined(MBEDTLS_TEST_HOOKS)
void (*mbedtls_test_hook_error_add)(int, int, const char *, int);
#endif
#endif /* MBEDTLS_ERROR_C || MBEDTLS_ERROR_STRERROR_DUMMY */
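Both `mbedtls_high_level_strerr` and `mbedtls_low_level_strerr` above work on the same split: the magnitude of a (negative) error code is masked with `0xFF80` for the high-level module part, and the remaining bits form the low-level part. A small Python sketch of that split; the combined code `-0x2706` is an illustrative value, not taken from `mbedtls/error.h`.

```python
# Sketch of the error-code split used by mbedtls_strerror: high-level part
# is (magnitude & 0xFF80), low-level part is the remaining bits.
HIGH_LEVEL_MASK = 0xFF80

def split_error(ret):
    magnitude = -ret if ret < 0 else ret   # mirrors the "if (ret < 0)" negation
    high = magnitude & HIGH_LEVEL_MASK
    low = magnitude & ~HIGH_LEVEL_MASK
    return high, low

# A hypothetical combined code -(0x2700 + 0x0006):
high, low = split_error(-0x2706)
print(hex(high), hex(low))                 # → 0x2700 0x6
```

When both parts are nonzero, `mbedtls_strerror` concatenates the two descriptions with `" : "`, which is why it needs at least 5 spare bytes before appending the low-level string.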


@ -0,0 +1,121 @@
/*
* Query Mbed TLS compile time configurations from mbedtls_config.h
*
* Copyright The Mbed TLS Contributors
* SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
*/
#include "mbedtls/build_info.h"
#include "query_config.h"
#include "mbedtls/platform.h"
/*
* Include all the headers with public APIs in case they define a macro to its
* default value when that configuration is not set in mbedtls_config.h, or
* for PSA_WANT macros, in case they're auto-defined based on mbedtls_config.h
* rather than defined directly in crypto_config.h.
*/
#include "psa/crypto.h"
#include "mbedtls/aes.h"
#include "mbedtls/aria.h"
#include "mbedtls/asn1.h"
#include "mbedtls/asn1write.h"
#include "mbedtls/base64.h"
#include "mbedtls/bignum.h"
#include "mbedtls/camellia.h"
#include "mbedtls/ccm.h"
#include "mbedtls/chacha20.h"
#include "mbedtls/chachapoly.h"
#include "mbedtls/cipher.h"
#include "mbedtls/cmac.h"
#include "mbedtls/ctr_drbg.h"
#include "mbedtls/debug.h"
#include "mbedtls/des.h"
#include "mbedtls/dhm.h"
#include "mbedtls/ecdh.h"
#include "mbedtls/ecdsa.h"
#include "mbedtls/ecjpake.h"
#include "mbedtls/ecp.h"
#include "mbedtls/entropy.h"
#include "mbedtls/error.h"
#include "mbedtls/gcm.h"
#include "mbedtls/hkdf.h"
#include "mbedtls/hmac_drbg.h"
#include "mbedtls/md.h"
#include "mbedtls/md5.h"
#include "mbedtls/memory_buffer_alloc.h"
#include "mbedtls/net_sockets.h"
#include "mbedtls/nist_kw.h"
#include "mbedtls/oid.h"
#include "mbedtls/pem.h"
#include "mbedtls/pk.h"
#include "mbedtls/pkcs12.h"
#include "mbedtls/pkcs5.h"
#if defined(MBEDTLS_HAVE_TIME)
#include "mbedtls/platform_time.h"
#endif
#include "mbedtls/platform_util.h"
#include "mbedtls/poly1305.h"
#include "mbedtls/ripemd160.h"
#include "mbedtls/rsa.h"
#include "mbedtls/sha1.h"
#include "mbedtls/sha256.h"
#include "mbedtls/sha512.h"
#include "mbedtls/ssl.h"
#include "mbedtls/ssl_cache.h"
#include "mbedtls/ssl_ciphersuites.h"
#include "mbedtls/ssl_cookie.h"
#include "mbedtls/ssl_ticket.h"
#include "mbedtls/threading.h"
#include "mbedtls/timing.h"
#include "mbedtls/version.h"
#include "mbedtls/x509.h"
#include "mbedtls/x509_crl.h"
#include "mbedtls/x509_crt.h"
#include "mbedtls/x509_csr.h"
#include <string.h>
/*
* Helper macros to convert a macro or its expansion into a string
* WARNING: This does not work for expanding function-like macros. However,
* Mbed TLS does not currently have configuration options used in this fashion.
*/
#define MACRO_EXPANSION_TO_STR(macro) MACRO_NAME_TO_STR(macro)
#define MACRO_NAME_TO_STR(macro) \
mbedtls_printf("%s", strlen( #macro "") > 0 ? #macro "\n" : "")
#define STRINGIFY(macro) #macro
#define OUTPUT_MACRO_NAME_VALUE(macro) mbedtls_printf( #macro "%s\n", \
(STRINGIFY(macro) "")[0] != 0 ? "=" STRINGIFY( \
macro) : "")
#if defined(_MSC_VER)
/*
* Visual Studio throws the warning 4003 because many Mbed TLS feature macros
* are defined empty. This means that from the preprocessor's point of view
* the macro MBEDTLS_EXPANSION_TO_STR is being invoked without arguments as
* some macros expand to nothing. We suppress that specific warning to get a
* clean build and to ensure that tests treating warnings as errors do not
* fail.
*/
#pragma warning(push)
#pragma warning(disable:4003)
#endif /* _MSC_VER */
int query_config(const char *config)
{
CHECK_CONFIG /* If the symbol is not found, return an error */
return 1;
}
void list_config(void)
{
LIST_CONFIG
}
#if defined(_MSC_VER)
#pragma warning(pop)
#endif /* _MSC_VER */


@@ -0,0 +1,50 @@
/*
* Version feature information
*
* Copyright The Mbed TLS Contributors
* SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
*/
#include "common.h"
#if defined(MBEDTLS_VERSION_C)
#include "mbedtls/version.h"
#include <string.h>
static const char * const features[] = {
#if defined(MBEDTLS_VERSION_FEATURES)
FEATURE_DEFINES
#endif /* MBEDTLS_VERSION_FEATURES */
NULL
};
int mbedtls_version_check_feature(const char *feature)
{
const char * const *idx = features;
if (*idx == NULL) {
return -2;
}
if (feature == NULL) {
return -1;
}
if (strncmp(feature, "MBEDTLS_", 8)) {
return -1;
}
feature += 8;
while (*idx != NULL) {
if (!strcmp(*idx, feature)) {
return 0;
}
idx++;
}
return -1;
}
#endif /* MBEDTLS_VERSION_C */
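The lookup in mbedtls_version_check_feature() above strips the "MBEDTLS_" prefix before scanning the table, which stores feature names without it. A minimal Python sketch of the same logic (the feature names below are illustrative stand-ins, not the real compiled-in list):

```python
# Mirrors mbedtls_version_check_feature(): the table stores names without
# the "MBEDTLS_" prefix, so queries must carry it. Return codes match the
# C function: 0 = supported, -1 = unknown/invalid, -2 = no list available.
FEATURES = ("AES_C", "SHA256_C")  # illustrative stand-in for FEATURE_DEFINES

def check_feature(feature):
    if not FEATURES:
        return -2  # MBEDTLS_VERSION_FEATURES not compiled in
    if feature is None or not feature.startswith("MBEDTLS_"):
        return -1
    return 0 if feature[len("MBEDTLS_"):] in FEATURES else -1

print(check_feature("MBEDTLS_AES_C"))  # -> 0
print(check_feature("AES_C"))          # -> -1 (prefix is required)
```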


@@ -0,0 +1,171 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<ItemGroup>
<SOURCES>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="mbedTLS.vcxproj">
<Project>{46cf2d25-6a36-4189-b59c-e4815388e554}</Project>
<LinkLibraryDependencies>true</LinkLibraryDependencies>
</ProjectReference>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid><GUID></ProjectGuid>
<Keyword>Win32Proj</Keyword>
<RootNamespace><APPNAME></RootNamespace>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<LinkIncremental>true</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<LinkIncremental>true</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<LinkIncremental>false</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<LinkIncremental>false</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<SubSystem>Console</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
<AdditionalLibraryDirectories>Debug</AdditionalLibraryDirectories>
</Link>
<ProjectReference>
<LinkLibraryDependencies>false</LinkLibraryDependencies>
</ProjectReference>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<SubSystem>Console</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
<AdditionalLibraryDirectories>Debug</AdditionalLibraryDirectories>
</Link>
<ProjectReference>
<LinkLibraryDependencies>false</LinkLibraryDependencies>
</ProjectReference>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>NDEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<SubSystem>Console</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
<AdditionalLibraryDirectories>Release</AdditionalLibraryDirectories>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>NDEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<SubSystem>Console</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
<AdditionalLibraryDirectories>Release</AdditionalLibraryDirectories>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
</Link>
</ItemDefinitionGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>


@@ -0,0 +1,159 @@
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{46CF2D25-6A36-4189-B59C-E4815388E554}</ProjectGuid>
<Keyword>Win32Proj</Keyword>
<RootNamespace>mbedTLS</RootNamespace>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>StaticLibrary</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<LinkIncremental>true</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<LinkIncremental>true</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<LinkIncremental>false</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<LinkIncremental>false</LinkIncremental>
<IntDir>$(Configuration)\$(TargetName)\</IntDir>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>_USRDLL;MBEDTLS_EXPORTS;KRML_VERIFIED_UINT128;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
<CompileAs>CompileAsC</CompileAs>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>_USRDLL;MBEDTLS_EXPORTS;KRML_VERIFIED_UINT128;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
<CompileAs>CompileAsC</CompileAs>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>NDEBUG;_USRDLL;MBEDTLS_EXPORTS;KRML_VERIFIED_UINT128;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
<AdditionalDependencies>bcrypt.lib;%(AdditionalDependencies)</AdditionalDependencies>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN64;NDEBUG;_WINDOWS;_USRDLL;MBEDTLS_EXPORTS;KRML_VERIFIED_UINT128;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<AdditionalIncludeDirectories>
INCLUDE_DIRECTORIES
</AdditionalIncludeDirectories>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<ItemGroup>
HEADER_ENTRIES
</ItemGroup>
<ItemGroup>
SOURCE_ENTRIES
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>


@@ -0,0 +1,30 @@

Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio 2013
VisualStudioVersion = 12.0.31101.0
MinimumVisualStudioVersion = 10.0.40219.1
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "mbedTLS", "mbedTLS.vcxproj", "{46CF2D25-6A36-4189-B59C-E4815388E554}"
EndProject
APP_ENTRIES
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Win32 = Debug|Win32
Debug|x64 = Debug|x64
Release|Win32 = Release|Win32
Release|x64 = Release|x64
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{46CF2D25-6A36-4189-B59C-E4815388E554}.Debug|Win32.ActiveCfg = Debug|Win32
{46CF2D25-6A36-4189-B59C-E4815388E554}.Debug|Win32.Build.0 = Debug|Win32
{46CF2D25-6A36-4189-B59C-E4815388E554}.Debug|x64.ActiveCfg = Debug|x64
{46CF2D25-6A36-4189-B59C-E4815388E554}.Debug|x64.Build.0 = Debug|x64
{46CF2D25-6A36-4189-B59C-E4815388E554}.Release|Win32.ActiveCfg = Release|Win32
{46CF2D25-6A36-4189-B59C-E4815388E554}.Release|Win32.Build.0 = Release|Win32
{46CF2D25-6A36-4189-B59C-E4815388E554}.Release|x64.ActiveCfg = Release|x64
{46CF2D25-6A36-4189-B59C-E4815388E554}.Release|x64.Build.0 = Release|x64
CONF_ENTRIES
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
EndGlobal


@@ -0,0 +1,19 @@
# Python package requirements for driver implementers.
# Jinja2 <3.0 needs an older version of markupsafe, but does not
# declare it.
# https://github.com/pallets/markupsafe/issues/282
# https://github.com/pallets/jinja/issues/1585
markupsafe < 2.1
# Use the version of Jinja that's in Ubuntu 20.04.
# See https://github.com/Mbed-TLS/mbedtls/pull/5067#discussion_r738794607 .
# Note that Jinja 3.0 drops support for Python 3.5, so we need to support
# Jinja 2.x as long as we're still using Python 3.5 anywhere.
# Jinja 2.10.1 doesn't support Python 3.10+
Jinja2 >= 2.10.1; python_version < '3.10'
Jinja2 >= 2.10.3; python_version >= '3.10'
# Jinja2 >=2.10, <3.0 needs a separate package for type annotations
types-Jinja2 >= 2.11.9
jsonschema >= 3.2.0
types-jsonschema >= 3.2.0
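The two Jinja2 lines above use PEP 508 environment markers so that pip picks a different minimum version per interpreter. The selection they encode can be sketched as follows (the function name is hypothetical):

```python
import sys

# Encodes the markers above: Jinja2 >= 2.10.1 on Python < 3.10,
# Jinja2 >= 2.10.3 on Python >= 3.10 (2.10.1 doesn't support 3.10+).
def min_jinja2_version(version_info=sys.version_info):
    return "2.10.3" if version_info[:2] >= (3, 10) else "2.10.1"

print(min_jinja2_version((3, 8)))   # -> 2.10.1
print(min_jinja2_version((3, 11)))  # -> 2.10.3
```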

87
externals/mbedtls/scripts/ecc-heap.sh vendored Executable file

@@ -0,0 +1,87 @@
#!/bin/sh
# Measure heap usage (and performance) of ECC operations with various values of
# the relevant tunable compile-time parameters.
#
# Usage (preferably on a 32-bit platform):
# cmake -D CMAKE_BUILD_TYPE=Release .
# scripts/ecc-heap.sh | tee ecc-heap.log
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
set -eu
CONFIG_H='include/mbedtls/mbedtls_config.h'
if [ -r $CONFIG_H ]; then :; else
echo "$CONFIG_H not found" >&2
exit 1
fi
if grep -i cmake Makefile >/dev/null; then :; else
echo "Needs CMake" >&2
exit 1
fi
if git status | grep -F $CONFIG_H >/dev/null 2>&1; then
echo "mbedtls_config.h not clean" >&2
exit 1
fi
CONFIG_BAK=${CONFIG_H}.bak
cp $CONFIG_H $CONFIG_BAK
cat << EOF >$CONFIG_H
#define MBEDTLS_PLATFORM_C
#define MBEDTLS_PLATFORM_MEMORY
#define MBEDTLS_MEMORY_BUFFER_ALLOC_C
#define MBEDTLS_MEMORY_DEBUG
#define MBEDTLS_TIMING_C
#define MBEDTLS_BIGNUM_C
#define MBEDTLS_ECP_C
#define MBEDTLS_ASN1_PARSE_C
#define MBEDTLS_ASN1_WRITE_C
#define MBEDTLS_ECDSA_C
#define MBEDTLS_SHA256_C // ECDSA benchmark needs it
#define MBEDTLS_SHA224_C // SHA256 requires this for now
#define MBEDTLS_ECDH_C
// NIST curves >= 256 bits
#define MBEDTLS_ECP_DP_SECP256R1_ENABLED
#define MBEDTLS_ECP_DP_SECP384R1_ENABLED
#define MBEDTLS_ECP_DP_SECP521R1_ENABLED
// SECP "koblitz-like" curve >= 256 bits
#define MBEDTLS_ECP_DP_SECP256K1_ENABLED
// Brainpool curves (no specialised "mod p" routine)
#define MBEDTLS_ECP_DP_BP256R1_ENABLED
#define MBEDTLS_ECP_DP_BP384R1_ENABLED
#define MBEDTLS_ECP_DP_BP512R1_ENABLED
// Montgomery curves
#define MBEDTLS_ECP_DP_CURVE25519_ENABLED
#define MBEDTLS_ECP_DP_CURVE448_ENABLED
#define MBEDTLS_HAVE_ASM // just make things a bit faster
#define MBEDTLS_ECP_NIST_OPTIM // faster and less allocations
//#define MBEDTLS_ECP_WINDOW_SIZE 4
//#define MBEDTLS_ECP_FIXED_POINT_OPTIM 1
EOF
for F in 0 1; do
for W in 2 3 4; do
scripts/config.py set MBEDTLS_ECP_WINDOW_SIZE $W
scripts/config.py set MBEDTLS_ECP_FIXED_POINT_OPTIM $F
make benchmark >/dev/null 2>&1
echo "fixed point optim = $F, max window size = $W"
echo "--------------------------------------------"
programs/test/benchmark ecdh ecdsa
done
done
# cleanup
mv $CONFIG_BAK $CONFIG_H
make clean

237
externals/mbedtls/scripts/ecp_comb_table.py vendored Executable file

@@ -0,0 +1,237 @@
#!/usr/bin/env python3
"""
Purpose
This script dumps the comb table of an EC curve. When you add a new EC curve,
you can use this script to generate the code that defines `<curve>_T` in
ecp_curves.c
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import os
import subprocess
import sys
import tempfile
HOW_TO_ADD_NEW_CURVE = """
If you are trying to add a new curve, you can follow these steps:
1. Define curve parameters (<curve>_p, <curve>_gx, etc...) in ecp_curves.c.
2. Add a macro to define <curve>_T to NULL following these parameters.
3. Build mbedcrypto
4. Run this script with the name of the new curve as its argument
5. Copy the output of this script into ecp_curves.c and replace the macro added
in Step 2
6. Rebuild and test if everything is ok
Replace the <curve> in the above with the name of the curve you want to add."""
CC = os.getenv('CC', 'cc')
MBEDTLS_LIBRARY_PATH = os.getenv('MBEDTLS_LIBRARY_PATH', "library")
SRC_DUMP_COMB_TABLE = r'''
#include <stdio.h>
#include <stdlib.h>
#include "mbedtls/ecp.h"
#include "mbedtls/error.h"
static void dump_mpi_initialize( const char *name, const mbedtls_mpi *d )
{
uint8_t buf[128] = {0};
size_t olen;
uint8_t *p;
olen = mbedtls_mpi_size( d );
mbedtls_mpi_write_binary_le( d, buf, olen );
printf("static const mbedtls_mpi_uint %s[] = {\n", name);
for (p = buf; p < buf + olen; p += 8) {
printf( " BYTES_TO_T_UINT_8( 0x%02X, 0x%02X, 0x%02X, 0x%02X, 0x%02X, 0x%02X, 0x%02X, 0x%02X ),\n",
p[0], p[1], p[2], p[3], p[4], p[5], p[6], p[7] );
}
printf("};\n");
}
static void dump_T( const mbedtls_ecp_group *grp )
{
char name[128];
printf( "#if MBEDTLS_ECP_FIXED_POINT_OPTIM == 1\n" );
for (size_t i = 0; i < grp->T_size; ++i) {
snprintf( name, sizeof(name), "%s_T_%zu_X", CURVE_NAME, i );
dump_mpi_initialize( name, &grp->T[i].X );
snprintf( name, sizeof(name), "%s_T_%zu_Y", CURVE_NAME, i );
dump_mpi_initialize( name, &grp->T[i].Y );
}
printf( "static const mbedtls_ecp_point %s_T[%zu] = {\n", CURVE_NAME, grp->T_size );
size_t olen;
for (size_t i = 0; i < grp->T_size; ++i) {
int z;
if ( mbedtls_mpi_cmp_int(&grp->T[i].Z, 0) == 0 ) {
z = 0;
} else if ( mbedtls_mpi_cmp_int(&grp->T[i].Z, 1) == 0 ) {
z = 1;
} else {
fprintf( stderr, "Unexpected value of Z (i = %d)\n", (int)i );
exit( 1 );
}
printf( " ECP_POINT_INIT_XY_Z%d(%s_T_%zu_X, %s_T_%zu_Y),\n",
z,
CURVE_NAME, i,
CURVE_NAME, i
);
}
printf("};\n#endif\n\n");
}
int main()
{
int rc;
mbedtls_mpi m;
mbedtls_ecp_point R;
mbedtls_ecp_group grp;
mbedtls_ecp_group_init( &grp );
rc = mbedtls_ecp_group_load( &grp, CURVE_ID );
if (rc != 0) {
char buf[100];
mbedtls_strerror( rc, buf, sizeof(buf) );
fprintf( stderr, "mbedtls_ecp_group_load: %s (-0x%x)\n", buf, -rc );
return 1;
}
grp.T = NULL;
mbedtls_ecp_point_init( &R );
mbedtls_mpi_init( &m);
mbedtls_mpi_lset( &m, 1 );
rc = mbedtls_ecp_mul( &grp, &R, &m, &grp.G, NULL, NULL );
if ( rc != 0 ) {
char buf[100];
mbedtls_strerror( rc, buf, sizeof(buf) );
fprintf( stderr, "mbedtls_ecp_mul: %s (-0x%x)\n", buf, -rc );
return 1;
}
if ( grp.T == NULL ) {
fprintf( stderr, "grp.T is not generated. Please make sure"
"MBEDTLS_ECP_FIXED_POINT_OPTIM is enabled in mbedtls_config.h\n" );
return 1;
}
dump_T( &grp );
return 0;
}
'''
SRC_DUMP_KNOWN_CURVE = r'''
#include <stdio.h>
#include <stdlib.h>
#include "mbedtls/ecp.h"
int main() {
const mbedtls_ecp_curve_info *info = mbedtls_ecp_curve_list();
mbedtls_ecp_group grp;
mbedtls_ecp_group_init( &grp );
while ( info->name != NULL ) {
mbedtls_ecp_group_load( &grp, info->grp_id );
if ( mbedtls_ecp_get_type(&grp) == MBEDTLS_ECP_TYPE_SHORT_WEIERSTRASS ) {
printf( " %s", info->name );
}
info++;
}
printf( "\n" );
return 0;
}
'''
def join_src_path(*args):
return os.path.normpath(os.path.join(os.path.dirname(__file__), "..", *args))
def run_c_source(src, cflags):
"""
Compile and run C source code
:param src: the C source code to compile and run
:param cflags: additional flags passed to the compiler
:return: True on success, False otherwise
"""
binname = tempfile.mktemp(prefix="mbedtls")
fd, srcname = tempfile.mkstemp(prefix="mbedtls", suffix=".c")
srcfile = os.fdopen(fd, mode="w")
srcfile.write(src)
srcfile.close()
args = [CC,
*cflags,
'-I' + join_src_path("include"),
"-o", binname,
'-L' + MBEDTLS_LIBRARY_PATH,
srcname,
'-lmbedcrypto']
p = subprocess.run(args=args, check=False)
if p.returncode != 0:
return False
p = subprocess.run(args=[binname], check=False, env={
'LD_LIBRARY_PATH': MBEDTLS_LIBRARY_PATH
})
if p.returncode != 0:
return False
os.unlink(srcname)
os.unlink(binname)
return True
def compute_curve(curve):
"""compute comb table for curve"""
r = run_c_source(
SRC_DUMP_COMB_TABLE,
[
'-g',
'-DCURVE_ID=MBEDTLS_ECP_DP_%s' % curve.upper(),
'-DCURVE_NAME="%s"' % curve.lower(),
])
if not r:
print("""\
Unable to compile and run utility.""", file=sys.stderr)
sys.exit(1)
def usage():
print("""
Usage: python %s <curve>...
Arguments:
curve Specify one or more curve names (e.g. secp256r1)
All possible curves: """ % sys.argv[0])
run_c_source(SRC_DUMP_KNOWN_CURVE, [])
print("""
Environment Variable:
CC Specify which C compiler to use to build the utility.
MBEDTLS_LIBRARY_PATH
Specify the path to mbedcrypto library. (e.g. build/library/)
How to add a new curve: %s""" % HOW_TO_ADD_NEW_CURVE)
def run_main():
shared_lib_path = os.path.normpath(os.path.join(MBEDTLS_LIBRARY_PATH, "libmbedcrypto.so"))
static_lib_path = os.path.normpath(os.path.join(MBEDTLS_LIBRARY_PATH, "libmbedcrypto.a"))
if not os.path.exists(shared_lib_path) and not os.path.exists(static_lib_path):
print("Warning: neither '%s' nor '%s' exists. This script will use "
"the library from your system instead of the library compiled in "
"this source directory.\n"
"You can specify library path using environment variable "
"'MBEDTLS_LIBRARY_PATH'." % (shared_lib_path, static_lib_path),
file=sys.stderr)
if len(sys.argv) <= 1:
usage()
else:
for curve in sys.argv[1:]:
compute_curve(curve)
if __name__ == '__main__':
run_main()
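`dump_mpi_initialize` in SRC_DUMP_COMB_TABLE above writes each MPI as little-endian bytes grouped eight per `BYTES_TO_T_UINT_8(...)` row. The grouping can be sketched in Python (the helper name is hypothetical; it assumes the byte length is a multiple of eight, as the C loop does):

```python
def to_t_uint8_rows(value, size):
    """Render an integer as little-endian bytes, eight per row, mirroring
    the BYTES_TO_T_UINT_8 lines that dump_mpi_initialize prints."""
    buf = value.to_bytes(size, "little")
    rows = []
    for i in range(0, len(buf), 8):
        chunk = buf[i:i + 8]
        rows.append("BYTES_TO_T_UINT_8( "
                    + ", ".join("0x%02X" % b for b in chunk) + " ),")
    return rows

for row in to_t_uint8_rows(0x0123456789ABCDEF, 8):
    print(row)
```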

108
externals/mbedtls/scripts/footprint.sh vendored Executable file

@@ -0,0 +1,108 @@
#!/bin/sh
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
# Purpose
#
# This script determines ROM size (or code size) for the standard Mbed TLS
# configurations, when built for a Cortex M3/M4 target.
#
# Configurations included:
# default include/mbedtls/mbedtls_config.h
# thread configs/config-thread.h
# suite-b configs/config-suite-b.h
# psk configs/config-ccm-psk-tls1_2.h
#
# Usage: footprint.sh
#
set -eu
CONFIG_H='include/mbedtls/mbedtls_config.h'
if [ -r $CONFIG_H ]; then :; else
echo "$CONFIG_H not found" >&2
echo "This script needs to be run from the root of" >&2
echo "a git checkout or uncompressed tarball" >&2
exit 1
fi
if grep -i cmake Makefile >/dev/null; then
echo "Not compatible with CMake" >&2
exit 1
fi
if which arm-none-eabi-gcc >/dev/null 2>&1; then :; else
echo "You need the ARM-GCC toolchain in your path" >&2
echo "See https://launchpad.net/gcc-arm-embedded/" >&2
exit 1
fi
ARMGCC_FLAGS='-Os -march=armv7-m -mthumb'
OUTFILE='00-footprint-summary.txt'
log()
{
echo "$@"
echo "$@" >> "$OUTFILE"
}
doit()
{
NAME="$1"
FILE="$2"
log ""
log "$NAME ($FILE):"
cp $CONFIG_H ${CONFIG_H}.bak
if [ "$FILE" != $CONFIG_H ]; then
cp "$FILE" $CONFIG_H
fi
{
scripts/config.py unset MBEDTLS_NET_C || true
scripts/config.py unset MBEDTLS_TIMING_C || true
scripts/config.py unset MBEDTLS_FS_IO || true
scripts/config.py --force set MBEDTLS_NO_PLATFORM_ENTROPY || true
} >/dev/null 2>&1
make clean >/dev/null
CC=arm-none-eabi-gcc AR=arm-none-eabi-ar LD=arm-none-eabi-ld \
CFLAGS="$ARMGCC_FLAGS" make lib >/dev/null
OUT="size-${NAME}.txt"
arm-none-eabi-size -t library/libmbed*.a > "$OUT"
log "$( head -n1 "$OUT" )"
log "$( tail -n1 "$OUT" )"
cp ${CONFIG_H}.bak $CONFIG_H
}
# truncate the file just this time
echo "(generated by $0)" > "$OUTFILE"
echo "" >> "$OUTFILE"
log "Footprint of standard configurations (minus net_sockets.c, timing.c, fs_io)"
log "for bare-metal ARM Cortex-M3/M4 microcontrollers."
VERSION_H="include/mbedtls/version.h"
MBEDTLS_VERSION=$( sed -n 's/.*VERSION_STRING *"\(.*\)"/\1/p' $VERSION_H )
if git rev-parse HEAD >/dev/null; then
GIT_HEAD=$( git rev-parse HEAD | head -c 10 )
GIT_VERSION=" (git head: $GIT_HEAD)"
else
GIT_VERSION=""
fi
log ""
log "Mbed TLS $MBEDTLS_VERSION$GIT_VERSION"
log "$( arm-none-eabi-gcc --version | head -n1 )"
log "CFLAGS=$ARMGCC_FLAGS"
doit default include/mbedtls/mbedtls_config.h
doit thread configs/config-thread.h
doit suite-b configs/config-suite-b.h
doit psk configs/config-ccm-psk-tls1_2.h
zip mbedtls-footprint.zip "$OUTFILE" size-*.txt >/dev/null


@@ -0,0 +1,212 @@
#!/usr/bin/env python3
"""Generate library/psa_crypto_driver_wrappers.h
library/psa_crypto_driver_wrappers_no_static.c
This module is invoked by the build scripts to auto-generate
psa_crypto_driver_wrappers.h and psa_crypto_driver_wrappers_no_static.c
based on template files in scripts/data_files/driver_templates/.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import sys
import os
import json
from typing import NewType, Dict, Any
from traceback import format_tb
import argparse
import jsonschema
import jinja2
from mbedtls_dev import build_tree
JSONSchema = NewType('JSONSchema', object)
# The Driver is an Object, but practically it's indexable and can be treated
# as a dictionary, to keep MyPy happy until MyPy comes with a more composite
# type for JSON objects.
Driver = NewType('Driver', dict)
class JsonValidationException(Exception):
def __init__(self, message="Json Validation Failed"):
self.message = message
super().__init__(self.message)
class DriverReaderException(Exception):
def __init__(self, message="Driver Reader Failed"):
self.message = message
super().__init__(self.message)
def render(template_path: str, driver_jsoncontext: list) -> str:
"""
Render template from the input file and driver JSON.
"""
environment = jinja2.Environment(
loader=jinja2.FileSystemLoader(os.path.dirname(template_path)),
keep_trailing_newline=True)
template = environment.get_template(os.path.basename(template_path))
return template.render(drivers=driver_jsoncontext)
def generate_driver_wrapper_file(template_dir: str,
output_dir: str,
template_file_name: str,
driver_jsoncontext: list) -> None:
"""
Generate a driver wrapper file from the given template.
"""
driver_wrapper_template_filename = \
os.path.join(template_dir, template_file_name)
result = render(driver_wrapper_template_filename, driver_jsoncontext)
with open(file=os.path.join(output_dir, os.path.splitext(template_file_name)[0]),
mode='w',
encoding='UTF-8') as out_file:
out_file.write(result)
def validate_json(driverjson_data: Driver, driverschema_list: dict) -> None:
"""
Validate the driver JSON against the appropriate schema;
the schema passed may be the one matching an opaque or transparent driver.
"""
driver_type = driverjson_data["type"]
driver_prefix = driverjson_data["prefix"]
try:
_schema = driverschema_list[driver_type]
jsonschema.validate(instance=driverjson_data, schema=_schema)
except KeyError as err:
# This could happen if the driverjson_data.type does not exist in the provided schema list
# schemas = {'transparent': transparent_driver_schema, 'opaque': opaque_driver_schema}
# Print onto stdout and stderr.
print("Unknown Driver type " + driver_type +
" for driver " + driver_prefix, str(err))
print("Unknown Driver type " + driver_type +
" for driver " + driver_prefix, str(err), file=sys.stderr)
raise JsonValidationException() from err
except jsonschema.exceptions.ValidationError as err:
# Print onto stdout and stderr.
print("Error: Failed to validate data file: {} using schema: {}."
"\n Exception Message: \"{}\""
" ".format(driverjson_data, _schema, str(err)))
print("Error: Failed to validate data file: {} using schema: {}."
"\n Exception Message: \"{}\""
" ".format(driverjson_data, _schema, str(err)), file=sys.stderr)
raise JsonValidationException() from err
def load_driver(schemas: Dict[str, Any], driver_file: str) -> Any:
"""Load a driver JSON file and validate it against its schema."""
with open(file=driver_file, mode='r', encoding='UTF-8') as f:
json_data = json.load(f)
try:
validate_json(json_data, schemas)
except JsonValidationException as e:
raise DriverReaderException from e
return json_data
def load_schemas(project_root: str) -> Dict[str, Any]:
"""
Load schemas map
"""
schema_file_paths = {
'transparent': os.path.join(project_root,
'scripts',
'data_files',
'driver_jsons',
'driver_transparent_schema.json'),
'opaque': os.path.join(project_root,
'scripts',
'data_files',
'driver_jsons',
'driver_opaque_schema.json')
}
driver_schema = {}
for key, file_path in schema_file_paths.items():
with open(file=file_path, mode='r', encoding='UTF-8') as file:
driver_schema[key] = json.load(file)
return driver_schema
def read_driver_descriptions(project_root: str,
json_directory: str,
jsondriver_list: str) -> list:
"""
Merge driver JSON files into a single ordered JSON after validation.
"""
driver_schema = load_schemas(project_root)
with open(file=os.path.join(json_directory, jsondriver_list),
mode='r',
encoding='UTF-8') as driver_list_file:
driver_list = json.load(driver_list_file)
return [load_driver(schemas=driver_schema,
driver_file=os.path.join(json_directory, driver_file_name))
for driver_file_name in driver_list]
def trace_exception(e: Exception, file=sys.stderr) -> None:
"""Prints exception trace to the given TextIO handle"""
print("Exception: type: %s, message: %s, trace: %s" % (
e.__class__, str(e), format_tb(e.__traceback__)
), file=file)
TEMPLATE_FILENAMES = ["psa_crypto_driver_wrappers.h.jinja",
"psa_crypto_driver_wrappers_no_static.c.jinja"]
def main() -> int:
"""
Main with command line arguments.
"""
def_arg_project_root = build_tree.guess_project_root()
parser = argparse.ArgumentParser()
parser.add_argument('--project-root', default=def_arg_project_root,
help='root directory of repo source code')
parser.add_argument('--template-dir',
help='directory holding the driver templates')
parser.add_argument('--json-dir',
help='directory holding the driver JSONs')
parser.add_argument('output_directory', nargs='?',
help='output file\'s location')
args = parser.parse_args()
project_root = os.path.abspath(args.project_root)
crypto_core_directory = build_tree.crypto_core_directory(project_root)
output_directory = args.output_directory if args.output_directory is not None else \
crypto_core_directory
template_directory = args.template_dir if args.template_dir is not None else \
os.path.join(project_root,
'scripts',
'data_files',
'driver_templates')
json_directory = args.json_dir if args.json_dir is not None else \
os.path.join(project_root,
'scripts',
'data_files',
'driver_jsons')
try:
# Read and validate list of driver jsons from driverlist.json
merged_driver_json = read_driver_descriptions(project_root,
json_directory,
'driverlist.json')
except DriverReaderException as e:
trace_exception(e)
return 1
for template_filename in TEMPLATE_FILENAMES:
generate_driver_wrapper_file(template_directory, output_directory,
template_filename, merged_driver_json)
return 0
if __name__ == '__main__':
sys.exit(main())
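As a stdlib-only sketch of the merge performed by `read_driver_descriptions` above: `driverlist.json` names the driver description files, each named file is loaded (and, in the real script, schema-validated), and the merged list preserves the order of the list file. The file contents below are illustrative stand-ins, not real driver JSONs.

```python
import json

# In-memory stand-ins for the files read by read_driver_descriptions()
# (illustrative data, not a real driver description).
driver_files = {
    'driverlist.json': '["a_driver.json", "b_driver.json"]',
    'a_driver.json': '{"prefix": "acc_a", "type": "transparent"}',
    'b_driver.json': '{"prefix": "acc_b", "type": "opaque"}',
}

# The list file fixes the order; each named file is loaded before being
# merged into a single ordered list.
order = json.loads(driver_files['driverlist.json'])
merged = [json.loads(driver_files[name]) for name in order]
print([d['prefix'] for d in merged])  # order follows driverlist.json
```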

240
externals/mbedtls/scripts/generate_errors.pl vendored Executable file

@ -0,0 +1,240 @@
#!/usr/bin/env perl
# Generate error.c
#
# Usage: ./generate_errors.pl or scripts/generate_errors.pl without arguments,
# or generate_errors.pl include_dir data_dir error_file
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
use strict;
use warnings;
my ($include_dir, $data_dir, $error_file);
if( @ARGV ) {
die "Invalid number of arguments" if scalar @ARGV != 3;
($include_dir, $data_dir, $error_file) = @ARGV;
-d $include_dir or die "No such directory: $include_dir\n";
-d $data_dir or die "No such directory: $data_dir\n";
} else {
$include_dir = 'include/mbedtls';
$data_dir = 'scripts/data_files';
$error_file = 'library/error.c';
unless( -d $include_dir && -d $data_dir ) {
chdir '..' or die;
-d $include_dir && -d $data_dir
or die "Without arguments, must be run from root or scripts\n"
}
}
my $error_format_file = $data_dir.'/error.fmt';
my @low_level_modules = qw( AES ARIA ASN1 BASE64 BIGNUM
CAMELLIA CCM CHACHA20 CHACHAPOLY CMAC CTR_DRBG DES
ENTROPY ERROR GCM HKDF HMAC_DRBG LMS MD5
NET OID PADLOCK PBKDF2 PLATFORM POLY1305 RIPEMD160
SHA1 SHA256 SHA512 SHA3 THREADING );
my @high_level_modules = qw( CIPHER DHM ECP MD
PEM PK PKCS12 PKCS5
RSA SSL X509 PKCS7 );
undef $/;
open(FORMAT_FILE, '<:crlf', "$error_format_file") or die "Opening error format file '$error_format_file': $!";
my $error_format = <FORMAT_FILE>;
close(FORMAT_FILE);
my @files = glob qq("$include_dir/*.h");
my @necessary_include_files;
my @matches;
foreach my $file (@files) {
open(FILE, '<:crlf', $file) or die("$0: $file: $!");
my $content = <FILE>;
close FILE;
my $found = 0;
while ($content =~ m[
# Both the before-comment and the after-comment are optional.
# Only the comment content is a regex capture group. The comment
# start and end parts are outside the capture group.
(?:/\*[*!](?!<) # Doxygen before-comment start
((?:[^*]|\*+[^*/])*) # $1: Comment content (no */ inside)
\*/)? # Comment end
\s*\#\s*define\s+(MBEDTLS_ERR_\w+) # $2: name
\s+\-(0[Xx][0-9A-Fa-f]+)\s* # $3: value (without the sign)
(?:/\*[*!]< # Doxygen after-comment start
((?:[^*]|\*+[^*/])*) # $4: Comment content (no */ inside)
\*/)? # Comment end
]gsx) {
my ($before, $name, $value, $after) = ($1, $2, $3, $4);
# Discard Doxygen comments that are coincidentally present before
# an error definition but not attached to it. This is ad hoc, based
# on what actually matters (or mattered at some point).
undef $before if defined($before) && $before =~ /\s*\\name\s/s;
die "Description neither before nor after $name in $file\n"
if !defined($before) && !defined($after);
die "Description both before and after $name in $file\n"
if defined($before) && defined($after);
my $description = (defined($before) ? $before : $after);
$description =~ s/^\s+//;
$description =~ s/\n( *\*)? */ /g;
$description =~ s/\.?\s+$//;
push @matches, [$name, $value, $description];
++$found;
}
if ($found) {
my $include_name = $file;
$include_name =~ s!.*/!!;
push @necessary_include_files, $include_name;
}
}
my $ll_old_define = "";
my $hl_old_define = "";
my $ll_code_check = "";
my $hl_code_check = "";
my $headers = "";
my %included_headers;
my %error_codes_seen;
foreach my $match (@matches)
{
my ($error_name, $error_code, $description) = @$match;
die "Duplicated error code: $error_code ($error_name)\n"
if( $error_codes_seen{$error_code}++ );
$description =~ s/\\/\\\\/g;
my ($module_name) = $error_name =~ /^MBEDTLS_ERR_([^_]+)/;
# Fix faulty ones
$module_name = "BIGNUM" if ($module_name eq "MPI");
$module_name = "CTR_DRBG" if ($module_name eq "CTR");
$module_name = "HMAC_DRBG" if ($module_name eq "HMAC");
my $define_name = $module_name;
$define_name = "X509_USE,X509_CREATE" if ($define_name eq "X509");
$define_name = "ASN1_PARSE" if ($define_name eq "ASN1");
$define_name = "SSL_TLS" if ($define_name eq "SSL");
$define_name = "PEM_PARSE,PEM_WRITE" if ($define_name eq "PEM");
$define_name = "PKCS7" if ($define_name eq "PKCS7");
my $include_name = $module_name;
$include_name =~ tr/A-Z/a-z/;
# Fix faulty ones
$include_name = "net_sockets" if ($module_name eq "NET");
$included_headers{"${include_name}.h"} = $module_name;
my $found_ll = grep $_ eq $module_name, @low_level_modules;
my $found_hl = grep $_ eq $module_name, @high_level_modules;
if (!$found_ll && !$found_hl)
{
printf("Error: Do not know how to handle: $module_name\n");
exit 1;
}
my $code_check;
my $old_define;
my $white_space;
my $first;
if ($found_ll)
{
$code_check = \$ll_code_check;
$old_define = \$ll_old_define;
$white_space = ' ';
}
else
{
$code_check = \$hl_code_check;
$old_define = \$hl_old_define;
$white_space = ' ';
}
if ($define_name ne ${$old_define})
{
if (${$old_define} ne "")
{
${$code_check} .= "#endif /* ";
$first = 0;
foreach my $dep (split(/,/, ${$old_define}))
{
${$code_check} .= " || " if ($first++);
${$code_check} .= "MBEDTLS_${dep}_C";
}
${$code_check} .= " */\n\n";
}
${$code_check} .= "#if ";
$headers .= "#if " if ($include_name ne "");
$first = 0;
foreach my $dep (split(/,/, ${define_name}))
{
${$code_check} .= " || " if ($first);
$headers .= " || " if ($first++);
${$code_check} .= "defined(MBEDTLS_${dep}_C)";
$headers .= "defined(MBEDTLS_${dep}_C)" if
($include_name ne "");
}
${$code_check} .= "\n";
$headers .= "\n#include \"mbedtls/${include_name}.h\"\n".
"#endif\n\n" if ($include_name ne "");
${$old_define} = $define_name;
}
${$code_check} .= "${white_space}case -($error_name):\n".
"${white_space} return( \"$module_name - $description\" );\n"
};
if ($ll_old_define ne "")
{
$ll_code_check .= "#endif /* ";
my $first = 0;
foreach my $dep (split(/,/, $ll_old_define))
{
$ll_code_check .= " || " if ($first++);
$ll_code_check .= "MBEDTLS_${dep}_C";
}
$ll_code_check .= " */\n";
}
if ($hl_old_define ne "")
{
$hl_code_check .= "#endif /* ";
my $first = 0;
foreach my $dep (split(/,/, $hl_old_define))
{
$hl_code_check .= " || " if ($first++);
$hl_code_check .= "MBEDTLS_${dep}_C";
}
$hl_code_check .= " */\n";
}
$error_format =~ s/HEADER_INCLUDED\n/$headers/g;
$error_format =~ s/LOW_LEVEL_CODE_CHECKS\n/$ll_code_check/g;
$error_format =~ s/HIGH_LEVEL_CODE_CHECKS\n/$hl_code_check/g;
open(ERROR_FILE, ">$error_file") or die "Opening destination file '$error_file': $!";
print ERROR_FILE $error_format;
close(ERROR_FILE);
my $errors = 0;
for my $include_name (@necessary_include_files)
{
if (not $included_headers{$include_name})
{
print STDERR "The header file \"$include_name\" defines error codes but has not been included!\n";
++$errors;
}
}
exit !!$errors;
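For readers less fluent in Perl, here is a rough Python rendering of the error-definition matcher above, reduced to the after-comment case only. The header line is synthetic (`MBEDTLS_ERR_DEMO_BAD_INPUT` is not a real error code), and the regex is a simplified sketch, not the script's actual code.

```python
import re

# Synthetic header line; MBEDTLS_ERR_DEMO_BAD_INPUT is not a real code.
header = '#define MBEDTLS_ERR_DEMO_BAD_INPUT -0x0020 /*!< Bad input. */'

pattern = re.compile(
    r'#\s*define\s+(MBEDTLS_ERR_\w+)'        # error name
    r'\s+-(0[Xx][0-9A-Fa-f]+)\s*'            # value, without the sign
    r'(?:/\*[*!]<((?:[^*]|\*+[^*/])*)\*/)?'  # optional Doxygen after-comment
)
for name, value, description in pattern.findall(header):
    print(name, value, description.strip())
```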


@ -0,0 +1,79 @@
#!/usr/bin/env perl
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
use strict;
my ($include_dir, $data_dir, $feature_file);
if( @ARGV ) {
die "Invalid number of arguments" if scalar @ARGV != 3;
($include_dir, $data_dir, $feature_file) = @ARGV;
-d $include_dir or die "No such directory: $include_dir\n";
-d $data_dir or die "No such directory: $data_dir\n";
} else {
$include_dir = 'include/mbedtls';
$data_dir = 'scripts/data_files';
$feature_file = 'library/version_features.c';
unless( -d $include_dir && -d $data_dir ) {
chdir '..' or die;
-d $include_dir && -d $data_dir
or die "Without arguments, must be run from root or scripts\n"
}
}
my $feature_format_file = $data_dir.'/version_features.fmt';
my @sections = ( "System support", "Mbed TLS modules",
"Mbed TLS feature support" );
my $line_separator = $/;
undef $/;
open(FORMAT_FILE, '<:crlf', "$feature_format_file") or die "Opening feature format file '$feature_format_file': $!";
my $feature_format = <FORMAT_FILE>;
close(FORMAT_FILE);
$/ = $line_separator;
open(CONFIG_H, '<:crlf', "$include_dir/mbedtls_config.h") || die("Failure when opening mbedtls_config.h: $!");
my $feature_defines = "";
my $in_section = 0;
while (my $line = <CONFIG_H>)
{
next if ($in_section && $line !~ /#define/ && $line !~ /SECTION/);
next if (!$in_section && $line !~ /SECTION/);
if ($in_section) {
if ($line =~ /SECTION/) {
$in_section = 0;
next;
}
# Strip leading MBEDTLS_ to save binary size
my ($mbedtls_prefix, $define) = $line =~ /#define (MBEDTLS_)?(\w+)/;
if (!$mbedtls_prefix) {
die "Feature does not start with 'MBEDTLS_': $line\n";
}
$feature_defines .= "#if defined(MBEDTLS_${define})\n";
$feature_defines .= " \"${define}\", //no-check-names\n";
$feature_defines .= "#endif /* MBEDTLS_${define} */\n";
}
if (!$in_section) {
my ($section_name) = $line =~ /SECTION: ([\w ]+)/;
my $found_section = grep $_ eq $section_name, @sections;
$in_section = 1 if ($found_section);
}
};
$feature_format =~ s/FEATURE_DEFINES\n/$feature_defines/g;
open(ERROR_FILE, ">$feature_file") or die "Opening destination file '$feature_file': $!";
print ERROR_FILE $feature_format;
close(ERROR_FILE);
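A Python sketch of the extraction performed by the Perl loop above: inside a recognized SECTION, each `#define MBEDTLS_X` line becomes a guarded string entry with the leading `MBEDTLS_` stripped from the literal (to save binary size) but kept in the `#if` guard. The input line is illustrative.

```python
import re

# One line of mbedtls_config.h, as the loop above would see it inside a
# recognized SECTION (illustrative input).
line = '#define MBEDTLS_AES_C'
match = re.match(r'#define (MBEDTLS_)?(\w+)', line)
mbedtls_prefix, define = match.groups()
assert mbedtls_prefix, 'feature does not start with MBEDTLS_'

# MBEDTLS_ is stripped from the string literal, kept in the guard.
entry = ('#if defined(MBEDTLS_%s)\n'
         '    "%s",\n'
         '#endif /* MBEDTLS_%s */\n') % (define, define, define)
print(entry)
```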


@ -0,0 +1,332 @@
#!/usr/bin/env python3
"""Generate psa_constant_names_generated.c
which is included by programs/psa/psa_constant_names.c.
The code generated by this module is only meant to be used in the context
of that program.
An argument passed to this script will modify the output directory where the
file is written:
* by default (no arguments passed): writes to programs/psa/
* OUTPUT_FILE_DIR passed: writes to OUTPUT_FILE_DIR/
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import os
import sys
from mbedtls_dev import build_tree
from mbedtls_dev import macro_collector
OUTPUT_TEMPLATE = '''\
/* Automatically generated by generate_psa_constant.py. DO NOT EDIT. */
static const char *psa_strerror(psa_status_t status)
{
switch (status) {
%(status_cases)s
default: return NULL;
}
}
static const char *psa_ecc_family_name(psa_ecc_family_t curve)
{
switch (curve) {
%(ecc_curve_cases)s
default: return NULL;
}
}
static const char *psa_dh_family_name(psa_dh_family_t group)
{
switch (group) {
%(dh_group_cases)s
default: return NULL;
}
}
static const char *psa_hash_algorithm_name(psa_algorithm_t hash_alg)
{
switch (hash_alg) {
%(hash_algorithm_cases)s
default: return NULL;
}
}
static const char *psa_ka_algorithm_name(psa_algorithm_t ka_alg)
{
switch (ka_alg) {
%(ka_algorithm_cases)s
default: return NULL;
}
}
static int psa_snprint_key_type(char *buffer, size_t buffer_size,
psa_key_type_t type)
{
size_t required_size = 0;
switch (type) {
%(key_type_cases)s
default:
%(key_type_code)s{
return snprintf(buffer, buffer_size,
"0x%%04x", (unsigned) type);
}
break;
}
buffer[0] = 0;
return (int) required_size;
}
#define NO_LENGTH_MODIFIER 0xfffffffflu
static int psa_snprint_algorithm(char *buffer, size_t buffer_size,
psa_algorithm_t alg)
{
size_t required_size = 0;
psa_algorithm_t core_alg = alg;
unsigned long length_modifier = NO_LENGTH_MODIFIER;
if (PSA_ALG_IS_MAC(alg)) {
core_alg = PSA_ALG_TRUNCATED_MAC(alg, 0);
if (alg & PSA_ALG_MAC_AT_LEAST_THIS_LENGTH_FLAG) {
append(&buffer, buffer_size, &required_size,
"PSA_ALG_AT_LEAST_THIS_LENGTH_MAC(", 33);
length_modifier = PSA_MAC_TRUNCATED_LENGTH(alg);
} else if (core_alg != alg) {
append(&buffer, buffer_size, &required_size,
"PSA_ALG_TRUNCATED_MAC(", 22);
length_modifier = PSA_MAC_TRUNCATED_LENGTH(alg);
}
} else if (PSA_ALG_IS_AEAD(alg)) {
core_alg = PSA_ALG_AEAD_WITH_DEFAULT_LENGTH_TAG(alg);
if (core_alg == 0) {
/* For unknown AEAD algorithms, there is no "default tag length". */
core_alg = alg;
} else if (alg & PSA_ALG_AEAD_AT_LEAST_THIS_LENGTH_FLAG) {
append(&buffer, buffer_size, &required_size,
"PSA_ALG_AEAD_WITH_AT_LEAST_THIS_LENGTH_TAG(", 43);
length_modifier = PSA_ALG_AEAD_GET_TAG_LENGTH(alg);
} else if (core_alg != alg) {
append(&buffer, buffer_size, &required_size,
"PSA_ALG_AEAD_WITH_SHORTENED_TAG(", 32);
length_modifier = PSA_ALG_AEAD_GET_TAG_LENGTH(alg);
}
} else if (PSA_ALG_IS_KEY_AGREEMENT(alg) &&
!PSA_ALG_IS_RAW_KEY_AGREEMENT(alg)) {
core_alg = PSA_ALG_KEY_AGREEMENT_GET_KDF(alg);
append(&buffer, buffer_size, &required_size,
"PSA_ALG_KEY_AGREEMENT(", 22);
append_with_alg(&buffer, buffer_size, &required_size,
psa_ka_algorithm_name,
PSA_ALG_KEY_AGREEMENT_GET_BASE(alg));
append(&buffer, buffer_size, &required_size, ", ", 2);
}
switch (core_alg) {
%(algorithm_cases)s
default:
%(algorithm_code)s{
append_integer(&buffer, buffer_size, &required_size,
"0x%%08lx", (unsigned long) core_alg);
}
break;
}
if (core_alg != alg) {
if (length_modifier != NO_LENGTH_MODIFIER) {
append(&buffer, buffer_size, &required_size, ", ", 2);
append_integer(&buffer, buffer_size, &required_size,
"%%lu", length_modifier);
}
append(&buffer, buffer_size, &required_size, ")", 1);
}
buffer[0] = 0;
return (int) required_size;
}
static int psa_snprint_key_usage(char *buffer, size_t buffer_size,
psa_key_usage_t usage)
{
size_t required_size = 0;
if (usage == 0) {
if (buffer_size > 1) {
buffer[0] = '0';
buffer[1] = 0;
} else if (buffer_size == 1) {
buffer[0] = 0;
}
return 1;
}
%(key_usage_code)s
if (usage != 0) {
if (required_size != 0) {
append(&buffer, buffer_size, &required_size, " | ", 3);
}
append_integer(&buffer, buffer_size, &required_size,
"0x%%08lx", (unsigned long) usage);
} else {
buffer[0] = 0;
}
return (int) required_size;
}
/* End of automatically generated file. */
'''
KEY_TYPE_FROM_CURVE_TEMPLATE = '''if (%(tester)s(type)) {
append_with_curve(&buffer, buffer_size, &required_size,
"%(builder)s", %(builder_length)s,
PSA_KEY_TYPE_ECC_GET_FAMILY(type));
} else '''
KEY_TYPE_FROM_GROUP_TEMPLATE = '''if (%(tester)s(type)) {
append_with_group(&buffer, buffer_size, &required_size,
"%(builder)s", %(builder_length)s,
PSA_KEY_TYPE_DH_GET_FAMILY(type));
} else '''
ALGORITHM_FROM_HASH_TEMPLATE = '''if (%(tester)s(core_alg)) {
append(&buffer, buffer_size, &required_size,
"%(builder)s(", %(builder_length)s + 1);
append_with_alg(&buffer, buffer_size, &required_size,
psa_hash_algorithm_name,
PSA_ALG_GET_HASH(core_alg));
append(&buffer, buffer_size, &required_size, ")", 1);
} else '''
BIT_TEST_TEMPLATE = '''\
if (%(var)s & %(flag)s) {
if (required_size != 0) {
append(&buffer, buffer_size, &required_size, " | ", 3);
}
append(&buffer, buffer_size, &required_size, "%(flag)s", %(length)d);
%(var)s ^= %(flag)s;
}\
'''
class CaseBuilder(macro_collector.PSAMacroCollector):
"""Collect PSA crypto macro definitions and write value recognition functions.
1. Call `read_file` on the input header file(s).
2. Call `write_file` to write ``psa_constant_names_generated.c``.
"""
def __init__(self):
super().__init__(include_intermediate=True)
@staticmethod
def _make_return_case(name):
return 'case %(name)s: return "%(name)s";' % {'name': name}
@staticmethod
def _make_append_case(name):
template = ('case %(name)s: '
'append(&buffer, buffer_size, &required_size, "%(name)s", %(length)d); '
'break;')
return template % {'name': name, 'length': len(name)}
@staticmethod
def _make_bit_test(var, flag):
return BIT_TEST_TEMPLATE % {'var': var,
'flag': flag,
'length': len(flag)}
def _make_status_cases(self):
return '\n '.join(map(self._make_return_case,
sorted(self.statuses)))
def _make_ecc_curve_cases(self):
return '\n '.join(map(self._make_return_case,
sorted(self.ecc_curves)))
def _make_dh_group_cases(self):
return '\n '.join(map(self._make_return_case,
sorted(self.dh_groups)))
def _make_key_type_cases(self):
return '\n '.join(map(self._make_append_case,
sorted(self.key_types)))
@staticmethod
def _make_key_type_from_curve_code(builder, tester):
return KEY_TYPE_FROM_CURVE_TEMPLATE % {'builder': builder,
'builder_length': len(builder),
'tester': tester}
@staticmethod
def _make_key_type_from_group_code(builder, tester):
return KEY_TYPE_FROM_GROUP_TEMPLATE % {'builder': builder,
'builder_length': len(builder),
'tester': tester}
def _make_ecc_key_type_code(self):
d = self.key_types_from_curve
make = self._make_key_type_from_curve_code
return ''.join([make(k, d[k]) for k in sorted(d.keys())])
def _make_dh_key_type_code(self):
d = self.key_types_from_group
make = self._make_key_type_from_group_code
return ''.join([make(k, d[k]) for k in sorted(d.keys())])
def _make_hash_algorithm_cases(self):
return '\n '.join(map(self._make_return_case,
sorted(self.hash_algorithms)))
def _make_ka_algorithm_cases(self):
return '\n '.join(map(self._make_return_case,
sorted(self.ka_algorithms)))
def _make_algorithm_cases(self):
return '\n '.join(map(self._make_append_case,
sorted(self.algorithms)))
@staticmethod
def _make_algorithm_from_hash_code(builder, tester):
return ALGORITHM_FROM_HASH_TEMPLATE % {'builder': builder,
'builder_length': len(builder),
'tester': tester}
def _make_algorithm_code(self):
d = self.algorithms_from_hash
make = self._make_algorithm_from_hash_code
return ''.join([make(k, d[k]) for k in sorted(d.keys())])
def _make_key_usage_code(self):
return '\n'.join([self._make_bit_test('usage', bit)
for bit in sorted(self.key_usage_flags)])
def write_file(self, output_file):
"""Generate the pretty-printer function code from the gathered
constant definitions.
"""
data = {}
data['status_cases'] = self._make_status_cases()
data['ecc_curve_cases'] = self._make_ecc_curve_cases()
data['dh_group_cases'] = self._make_dh_group_cases()
data['key_type_cases'] = self._make_key_type_cases()
data['key_type_code'] = (self._make_ecc_key_type_code() +
self._make_dh_key_type_code())
data['hash_algorithm_cases'] = self._make_hash_algorithm_cases()
data['ka_algorithm_cases'] = self._make_ka_algorithm_cases()
data['algorithm_cases'] = self._make_algorithm_cases()
data['algorithm_code'] = self._make_algorithm_code()
data['key_usage_code'] = self._make_key_usage_code()
output_file.write(OUTPUT_TEMPLATE % data)
def generate_psa_constants(header_file_names, output_file_name):
collector = CaseBuilder()
for header_file_name in header_file_names:
with open(header_file_name, 'rb') as header_file:
collector.read_file(header_file)
temp_file_name = output_file_name + '.tmp'
with open(temp_file_name, 'w') as output_file:
collector.write_file(output_file)
os.replace(temp_file_name, output_file_name)
if __name__ == '__main__':
build_tree.chdir_to_root()
    # Allow changing the directory where psa_constant_names_generated.c is written.
OUTPUT_FILE_DIR = sys.argv[1] if len(sys.argv) == 2 else "programs/psa"
generate_psa_constants(['include/psa/crypto_values.h',
'include/psa/crypto_extra.h'],
OUTPUT_FILE_DIR + '/psa_constant_names_generated.c')
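The core of the code generation above is the case template in `CaseBuilder._make_append_case`: every collected constant becomes a switch case that appends its own spelling, with the string length precomputed at generation time. A minimal standalone sketch, applied to one constant name:

```python
# The same template string CaseBuilder._make_append_case uses, applied
# to a single constant name for illustration.
def make_append_case(name):
    template = ('case %(name)s: '
                'append(&buffer, buffer_size, &required_size, "%(name)s", %(length)d); '
                'break;')
    return template % {'name': name, 'length': len(name)}

print(make_append_case('PSA_KEY_TYPE_AES'))
```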


@ -0,0 +1,116 @@
#! /usr/bin/env perl
# Generate query_config.c
#
# The file query_config.c contains a C function that can be used to check if
# a configuration macro is defined and to retrieve its expansion in string
# form (if any). This facilitates querying the compile time configuration of
# the library, for example, for testing.
#
# The query_config.c is generated from the default configuration files
# include/mbedtls/mbedtls_config.h and include/psa/crypto_config.h.
# The idea is that mbedtls_config.h and crypto_config.h contain ALL the
# compile time configurations available in Mbed TLS (commented or uncommented).
# This script extracts the configuration macros from the two files and this
# information is used to automatically generate the body of the query_config()
# function by using the template in scripts/data_files/query_config.fmt.
#
# Usage: scripts/generate_query_config.pl without arguments, or
# generate_query_config.pl mbedtls_config_file psa_crypto_config_file template_file output_file
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
use strict;
my ($mbedtls_config_file, $psa_crypto_config_file, $query_config_format_file, $query_config_file);
my $default_mbedtls_config_file = "./include/mbedtls/mbedtls_config.h";
my $default_psa_crypto_config_file = "./include/psa/crypto_config.h";
my $default_query_config_format_file = "./scripts/data_files/query_config.fmt";
my $default_query_config_file = "./programs/test/query_config.c";
if( @ARGV ) {
die "Invalid number of arguments - usage: $0 [MBED_TLS_CONFIG_FILE PSA_CRYPTO_CONFIG_FILE TEMPLATE_FILE OUTPUT_FILE]" if scalar @ARGV != 4;
($mbedtls_config_file, $psa_crypto_config_file, $query_config_format_file, $query_config_file) = @ARGV;
-f $mbedtls_config_file or die "No such file: $mbedtls_config_file";
-f $psa_crypto_config_file or die "No such file: $psa_crypto_config_file";
-f $query_config_format_file or die "No such file: $query_config_format_file";
} else {
$mbedtls_config_file = $default_mbedtls_config_file;
$psa_crypto_config_file = $default_psa_crypto_config_file;
$query_config_format_file = $default_query_config_format_file;
$query_config_file = $default_query_config_file;
unless(-f $mbedtls_config_file && -f $query_config_format_file && -f $psa_crypto_config_file) {
chdir '..' or die;
-f $mbedtls_config_file && -f $query_config_format_file && -f $psa_crypto_config_file
or die "No arguments supplied, must be run from project root or a first-level subdirectory\n";
}
}
# Excluded macros from the generated query_config.c. For example, macros that
# have commas or function-like macros cannot be transformed into strings easily
# using the preprocessor, so they should be excluded or the preprocessor will
# throw errors.
my @excluded = qw(
MBEDTLS_SSL_CIPHERSUITES
);
my $excluded_re = join '|', @excluded;
# This variable will contain the string to replace in the CHECK_CONFIG of the
# format file
my $config_check = "";
my $list_config = "";
for my $config_file ($mbedtls_config_file, $psa_crypto_config_file) {
next unless defined($config_file); # we might not have been given a PSA crypto config file
open(CONFIG_FILE, "<", $config_file) or die "Opening config file '$config_file': $!";
while (my $line = <CONFIG_FILE>) {
if ($line =~ /^(\/\/)?\s*#\s*define\s+(MBEDTLS_\w+|PSA_WANT_\w+).*/) {
my $name = $2;
# Skip over the macro if it is in the excluded list
next if $name =~ /$excluded_re/;
$config_check .= <<EOT;
#if defined($name)
if( strcmp( "$name", config ) == 0 )
{
MACRO_EXPANSION_TO_STR( $name );
return( 0 );
}
#endif /* $name */
EOT
$list_config .= <<EOT;
#if defined($name)
OUTPUT_MACRO_NAME_VALUE($name);
#endif /* $name */
EOT
}
}
close(CONFIG_FILE);
}
# Read the full format file into a string
local $/;
open(FORMAT_FILE, "<", $query_config_format_file) or die "Opening query config format file '$query_config_format_file': $!";
my $query_config_format = <FORMAT_FILE>;
close(FORMAT_FILE);
# Replace the body of the query_config() function with the code we just wrote
$query_config_format =~ s/CHECK_CONFIG/$config_check/g;
$query_config_format =~ s/LIST_CONFIG/$list_config/g;
# Rewrite the query_config.c file
open(QUERY_CONFIG_FILE, ">", $query_config_file) or die "Opening destination file '$query_config_file': $!";
print QUERY_CONFIG_FILE $query_config_format;
close(QUERY_CONFIG_FILE);
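A key detail of the loop above is that the regex deliberately matches commented-out defines too, so every known configuration option gets a `query_config()` entry whether or not it is enabled by default. A stdlib-only Python sketch of that match, on two illustrative config lines:

```python
import re

# Two representative config lines: one commented out, one active
# (illustrative; not read from the real headers).
config = '//#define MBEDTLS_HAVE_TIME\n#define PSA_WANT_ALG_SHA_256 1\n'

# Match both commented and uncommented defines, as the Perl loop does.
pattern = re.compile(r'^(//)?\s*#\s*define\s+(MBEDTLS_\w+|PSA_WANT_\w+)',
                     re.MULTILINE)
names = [name for _, name in pattern.findall(config)]
print(names)
```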


@ -0,0 +1,414 @@
#!/usr/bin/env python3
"""Generate library/ssl_debug_helpers_generated.c
The code generated by this module includes debug helper functions that cannot
be implemented as fixed, hand-written code.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import sys
import re
import os
import textwrap
import argparse
from mbedtls_dev import build_tree
def remove_c_comments(string):
"""
Remove C style comments from input string
"""
string_pattern = r"(?P<string>\".*?\"|\'.*?\')"
comment_pattern = r"(?P<comment>/\*.*?\*/|//[^\r\n]*$)"
pattern = re.compile(string_pattern + r'|' + comment_pattern,
re.MULTILINE | re.DOTALL)
def replacer(match):
if match.lastgroup == 'comment':
return ""
return match.group()
return pattern.sub(replacer, string)
class CondDirectiveNotMatch(Exception):
pass
def preprocess_c_source_code(source, *classes):
"""
Simple preprocessor for C source code.
Only processes conditional directives, without expanding them.
Yields objects according to the input classes; the first class to match
takes precedence. If a conditional directive pair does not match, raises
CondDirectiveNotMatch.
Assumes the source code contains no comments and compiles successfully.
"""
pattern = re.compile(r"^[ \t]*#[ \t]*" +
r"(?P<directive>(if[ \t]|ifndef[ \t]|ifdef[ \t]|else|endif))" +
r"[ \t]*(?P<param>(.*\\\n)*.*$)",
re.MULTILINE)
stack = []
def _yield_objects(s, d, p, st, end):
"""
Output matched source piece
"""
nonlocal stack
start_line, end_line = '', ''
if stack:
start_line = '#{} {}'.format(d, p)
if d == 'if':
end_line = '#endif /* {} */'.format(p)
elif d == 'ifdef':
end_line = '#endif /* defined({}) */'.format(p)
else:
end_line = '#endif /* !defined({}) */'.format(p)
has_instance = False
for cls in classes:
for instance in cls.extract(s, st, end):
if has_instance is False:
has_instance = True
yield pair_start, start_line
yield instance.span()[0], instance
if has_instance:
yield start, end_line
for match in pattern.finditer(source):
directive = match.groupdict()['directive'].strip()
param = match.groupdict()['param']
start, end = match.span()
if directive in ('if', 'ifndef', 'ifdef'):
stack.append((directive, param, start, end))
continue
if not stack:
raise CondDirectiveNotMatch()
pair_directive, pair_param, pair_start, pair_end = stack.pop()
yield from _yield_objects(source,
pair_directive,
pair_param,
pair_end,
start)
if directive == 'endif':
continue
if pair_directive == 'if':
directive = 'if'
param = "!( {} )".format(pair_param)
elif pair_directive == 'ifdef':
directive = 'ifndef'
param = pair_param
else:
directive = 'ifdef'
param = pair_param
stack.append((directive, param, start, end))
assert not stack, len(stack)
class EnumDefinition:
"""
Generate helper functions around enumeration.
Currently, it generates a translation function from enum value to string.
Enum definition looks like:
[typedef] enum [prefix name] { [body] } [suffix name];
Known limitation:
- the '}' and ';' SHOULD NOT appear in different macro blocks, as in:
```
enum test {
....
#if defined(A)
....
};
#else
....
};
#endif
```
"""
@classmethod
def extract(cls, source_code, start=0, end=-1):
enum_pattern = re.compile(r'enum\s*(?P<prefix_name>\w*)\s*' +
r'{\s*(?P<body>[^}]*)}' +
r'\s*(?P<suffix_name>\w*)\s*;',
re.MULTILINE | re.DOTALL)
for match in enum_pattern.finditer(source_code, start, end):
yield EnumDefinition(source_code,
span=match.span(),
group=match.groupdict())
def __init__(self, source_code, span=None, group=None):
assert isinstance(group, dict)
prefix_name = group.get('prefix_name', None)
suffix_name = group.get('suffix_name', None)
body = group.get('body', None)
assert prefix_name or suffix_name
assert body
assert span
# If suffix_name exists, it is a typedef
self._prototype = suffix_name if suffix_name else 'enum ' + prefix_name
self._name = suffix_name if suffix_name else prefix_name
self._body = body
self._source = source_code
self._span = span
def __repr__(self):
return 'Enum({},{})'.format(self._name, self._span)
def __str__(self):
return repr(self)
def span(self):
return self._span
def generate_translation_function(self):
"""
Generate function for translating value to string
"""
translation_table = []
for line in self._body.splitlines():
if line.strip().startswith('#'):
# Preprocess directive, keep it in table
translation_table.append(line.strip())
continue
if not line.strip():
continue
for field in line.strip().split(','):
if not field.strip():
continue
member = field.strip().split()[0]
translation_table.append(
'{space}case {member}:\n{space} return "{member}";'
.format(member=member, space=' '*8)
)
body = textwrap.dedent('''\
const char *{name}_str( {prototype} in )
{{
switch (in) {{
{translation_table}
default:
return "UNKNOWN_VALUE";
}}
}}
''')
body = body.format(translation_table='\n'.join(translation_table),
name=self._name,
prototype=self._prototype)
return body
class SignatureAlgorithmDefinition:
"""
Generate helper functions for signature algorithms.
It generates translation function from signature algorithm define to string.
Signature algorithm definition looks like:
#define MBEDTLS_TLS1_3_SIG_[ upper case signature algorithm ] [ value(hex) ]
Known limitation:
- the definitions SHOULD exist in the same macro block.
"""
@classmethod
def extract(cls, source_code, start=0, end=-1):
sig_alg_pattern = re.compile(r'#define\s+(?P<name>MBEDTLS_TLS1_3_SIG_\w+)\s+' +
r'(?P<value>0[xX][0-9a-fA-F]+)$',
re.MULTILINE | re.DOTALL)
matches = list(sig_alg_pattern.finditer(source_code, start, end))
if matches:
yield SignatureAlgorithmDefinition(source_code, definitions=matches)
def __init__(self, source_code, definitions=None):
if definitions is None:
definitions = []
assert isinstance(definitions, list) and definitions
self._definitions = definitions
self._source = source_code
def __repr__(self):
return 'SigAlgs({})'.format(self._definitions[0].span())
def span(self):
return self._definitions[0].span()
def __str__(self):
"""
Generate function for translating value to string
"""
translation_table = []
for m in self._definitions:
name = m.groupdict()['name']
return_val = name[len('MBEDTLS_TLS1_3_SIG_'):].lower()
translation_table.append(
' case {}:\n return "{}";'.format(name, return_val))
body = textwrap.dedent('''\
const char *mbedtls_ssl_sig_alg_to_str( uint16_t in )
{{
switch( in )
{{
{translation_table}
}};
return "UNKNOWN";
}}''')
body = body.format(translation_table='\n'.join(translation_table))
return body
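A quick standalone check of the extraction regex above. The sample `#define` uses 0x0403, the IANA value for ecdsa_secp256r1_sha256, and derives the string the generated translation function would return:

```python
import re

sig_alg_pattern = re.compile(r'#define\s+(?P<name>MBEDTLS_TLS1_3_SIG_\w+)\s+'
                             r'(?P<value>0[xX][0-9a-fA-F]+)$',
                             re.MULTILINE | re.DOTALL)

sample = '#define MBEDTLS_TLS1_3_SIG_ECDSA_SECP256R1_SHA256 0x0403\n'
m = sig_alg_pattern.search(sample)
name = m.group('name')
# The generated function returns the name with the prefix stripped, lowercased.
return_val = name[len('MBEDTLS_TLS1_3_SIG_'):].lower()
print(return_val)  # -> ecdsa_secp256r1_sha256
```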
class NamedGroupDefinition:
"""
Generate helper functions for named groups.
It generates a translation function from named group defines to strings.
Named group definitions look like:
#define MBEDTLS_SSL_IANA_TLS_GROUP_[ upper case named group ] [ value(hex) ]
Known limitation:
- the definitions SHOULD exist in the same macro block.
"""
@classmethod
def extract(cls, source_code, start=0, end=-1):
named_group_pattern = re.compile(r'#define\s+(?P<name>MBEDTLS_SSL_IANA_TLS_GROUP_\w+)\s+' +
r'(?P<value>0[xX][0-9a-fA-F]+)$',
re.MULTILINE | re.DOTALL)
matches = list(named_group_pattern.finditer(source_code, start, end))
if matches:
yield NamedGroupDefinition(source_code, definitions=matches)
def __init__(self, source_code, definitions=None):
if definitions is None:
definitions = []
assert isinstance(definitions, list) and definitions
self._definitions = definitions
self._source = source_code
def __repr__(self):
return 'NamedGroup({})'.format(self._definitions[0].span())
def span(self):
return self._definitions[0].span()
def __str__(self):
"""
Generate a function for translating a value to a string
"""
translation_table = []
for m in self._definitions:
name = m.groupdict()['name']
iana_name = name[len('MBEDTLS_SSL_IANA_TLS_GROUP_'):].lower()
translation_table.append(' case {}:\n return "{}";'.format(name, iana_name))
body = textwrap.dedent('''\
const char *mbedtls_ssl_named_group_to_str( uint16_t in )
{{
switch( in )
{{
{translation_table}
}};
return "UNKNOWN";
}}''')
body = body.format(translation_table='\n'.join(translation_table))
return body
OUTPUT_C_TEMPLATE = '''\
/* Automatically generated by generate_ssl_debug_helpers.py. DO NOT EDIT. */
/**
* \\file ssl_debug_helpers_generated.c
*
* \\brief Automatically generated helper functions for debugging
*/
/*
* Copyright The Mbed TLS Contributors
* SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
*
*/
#include "common.h"
#if defined(MBEDTLS_DEBUG_C)
#include "ssl_debug_helpers.h"
{functions}
#endif /* MBEDTLS_DEBUG_C */
/* End of automatically generated file. */
'''
def generate_ssl_debug_helpers(output_directory, mbedtls_root):
"""
Generate debug helper functions.
"""
mbedtls_root = os.path.abspath(
mbedtls_root or build_tree.guess_mbedtls_root())
with open(os.path.join(mbedtls_root, 'include/mbedtls/ssl.h')) as f:
source_code = remove_c_comments(f.read())
definitions = dict()
for start, instance in preprocess_c_source_code(source_code,
EnumDefinition,
SignatureAlgorithmDefinition,
NamedGroupDefinition):
if start in definitions:
continue
if isinstance(instance, EnumDefinition):
definition = instance.generate_translation_function()
else:
definition = instance
definitions[start] = definition
function_definitions = [str(v) for _, v in sorted(definitions.items())]
if output_directory == sys.stdout:
sys.stdout.write(OUTPUT_C_TEMPLATE.format(
functions='\n'.join(function_definitions)))
else:
with open(os.path.join(output_directory, 'ssl_debug_helpers_generated.c'), 'w') as f:
f.write(OUTPUT_C_TEMPLATE.format(
functions='\n'.join(function_definitions)))
def main():
"""
Command line entry
"""
parser = argparse.ArgumentParser()
parser.add_argument('--mbedtls-root', nargs='?', default=None,
help='root directory of mbedtls source code')
parser.add_argument('output_directory', nargs='?',
default='library', help='source/header files location')
args = parser.parse_args()
generate_ssl_debug_helpers(args.output_directory, args.mbedtls_root)
return 0
if __name__ == '__main__':
sys.exit(main())


@@ -0,0 +1,289 @@
#!/usr/bin/env perl
# Generate main file, individual apps and solution files for
# MS Visual Studio 2013
#
# Must be run from Mbed TLS root or scripts directory.
# Takes no argument.
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
use warnings;
use strict;
use Digest::MD5 'md5_hex';
my $vsx_dir = "visualc/VS2013";
my $vsx_ext = "vcxproj";
my $vsx_app_tpl_file = "scripts/data_files/vs2013-app-template.$vsx_ext";
my $vsx_main_tpl_file = "scripts/data_files/vs2013-main-template.$vsx_ext";
my $vsx_main_file = "$vsx_dir/mbedTLS.$vsx_ext";
my $vsx_sln_tpl_file = "scripts/data_files/vs2013-sln-template.sln";
my $vsx_sln_file = "$vsx_dir/mbedTLS.sln";
my $programs_dir = 'programs';
my $mbedtls_header_dir = 'include/mbedtls';
my $psa_header_dir = 'include/psa';
my $source_dir = 'library';
my $test_source_dir = 'tests/src';
my $test_header_dir = 'tests/include/test';
my $test_drivers_header_dir = 'tests/include/test/drivers';
my $test_drivers_source_dir = 'tests/src/drivers';
my @thirdparty_header_dirs = qw(
3rdparty/everest/include/everest
);
my @thirdparty_source_dirs = qw(
3rdparty/everest/library
3rdparty/everest/library/kremlib
3rdparty/everest/library/legacy
);
# Directories to add to the include path.
# Order matters in case there are files with the same name in more than
# one directory: the compiler will use the first match.
my @include_directories = qw(
include
3rdparty/everest/include/
3rdparty/everest/include/everest
3rdparty/everest/include/everest/vs2013
3rdparty/everest/include/everest/kremlib
tests/include
);
my $include_directories = join(';', map {"../../$_"} @include_directories);
# Directories to add to the include path when building the library, but not
# when building tests or applications.
my @library_include_directories = qw(
library
);
my $library_include_directories =
join(';', map {"../../$_"} (@library_include_directories,
@include_directories));
my @excluded_files = qw(
3rdparty/everest/library/Hacl_Curve25519.c
);
my %excluded_files = ();
foreach (@excluded_files) { $excluded_files{$_} = 1 }
my $vsx_hdr_tpl = <<EOT;
<ClInclude Include="..\\..\\{NAME}" />
EOT
my $vsx_src_tpl = <<EOT;
<ClCompile Include="..\\..\\{NAME}" />
EOT
my $vsx_sln_app_entry_tpl = <<EOT;
Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "{APPNAME}", "{APPNAME}.vcxproj", "{GUID}"
ProjectSection(ProjectDependencies) = postProject
{46CF2D25-6A36-4189-B59C-E4815388E554} = {46CF2D25-6A36-4189-B59C-E4815388E554}
EndProjectSection
EndProject
EOT
my $vsx_sln_conf_entry_tpl = <<EOT;
{GUID}.Debug|Win32.ActiveCfg = Debug|Win32
{GUID}.Debug|Win32.Build.0 = Debug|Win32
{GUID}.Debug|x64.ActiveCfg = Debug|x64
{GUID}.Debug|x64.Build.0 = Debug|x64
{GUID}.Release|Win32.ActiveCfg = Release|Win32
{GUID}.Release|Win32.Build.0 = Release|Win32
{GUID}.Release|x64.ActiveCfg = Release|x64
{GUID}.Release|x64.Build.0 = Release|x64
EOT
exit( main() );
sub check_dirs {
foreach my $d (@thirdparty_header_dirs, @thirdparty_source_dirs) {
if (not (-d $d)) { return 0; }
}
return -d $vsx_dir
&& -d $mbedtls_header_dir
&& -d $psa_header_dir
&& -d $source_dir
&& -d $test_source_dir
&& -d $test_drivers_source_dir
&& -d $test_header_dir
&& -d $test_drivers_header_dir
&& -d $programs_dir;
}
sub slurp_file {
my ($filename) = @_;
local $/ = undef;
open my $fh, '<:crlf', $filename or die "Could not read $filename\n";
my $content = <$fh>;
close $fh;
return $content;
}
sub content_to_file {
my ($content, $filename) = @_;
open my $fh, '>:crlf', $filename or die "Could not write to $filename\n";
print $fh $content;
close $fh;
}
sub gen_app_guid {
my ($path) = @_;
my $guid = md5_hex( "mbedTLS:$path" );
$guid =~ s/(.{8})(.{4})(.{4})(.{4})(.{12})/\U{$1-$2-$3-$4-$5}/;
return $guid;
}
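For readers more at home in Python, here is a hedged sketch of the same deterministic GUID derivation as the Perl `gen_app_guid` above: MD5 of `"mbedTLS:<path>"`, uppercased and formatted as a `{8-4-4-4-12}` GUID (the path below is just an illustrative program name):

```python
import hashlib
import re

def gen_app_guid(path):
    """Python equivalent of the Perl gen_app_guid: a deterministic GUID
    derived from the MD5 of "mbedTLS:<path>"."""
    digest = hashlib.md5(('mbedTLS:' + path).encode()).hexdigest().upper()
    return '{{{}-{}-{}-{}-{}}}'.format(
        digest[0:8], digest[8:12], digest[12:16], digest[16:20], digest[20:32])

guid = gen_app_guid('aes/crypt_and_hash')
print(guid)
```

Because the GUID is a pure function of the path, regenerating the project files never churns GUIDs in version control.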
sub gen_app {
my ($path, $template, $dir, $ext) = @_;
my $guid = gen_app_guid( $path );
$path =~ s!/!\\!g;
(my $appname = $path) =~ s/.*\\//;
my $is_test_app = ($path =~ m/^test\\/);
my $srcs = "<ClCompile Include=\"..\\..\\programs\\$path.c\" \/>";
if( $appname eq "ssl_client2" or $appname eq "ssl_server2" or
$appname eq "query_compile_time_config" ) {
$srcs .= "\n <ClCompile Include=\"..\\..\\programs\\test\\query_config.c\" \/>";
}
if( $appname eq "ssl_client2" or $appname eq "ssl_server2" ) {
$srcs .= "\n <ClCompile Include=\"..\\..\\programs\\ssl\\ssl_test_lib.c\" \/>";
}
my $content = $template;
$content =~ s/<SOURCES>/$srcs/g;
$content =~ s/<APPNAME>/$appname/g;
$content =~ s/<GUID>/$guid/g;
$content =~ s/INCLUDE_DIRECTORIES\n/($is_test_app ?
$library_include_directories :
$include_directories)/ge;
content_to_file( $content, "$dir/$appname.$ext" );
}
sub get_app_list {
my $makefile_contents = slurp_file('programs/Makefile');
$makefile_contents =~ /\n\s*APPS\s*=[\\\s]*(.*?)(?<!\\)[\#\n]/s
or die "Cannot find APPS = ... in programs/Makefile\n";
return split /(?:\s|\\)+/, $1;
}
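The `get_app_list` regex above captures everything after `APPS =` up to the first newline not escaped by a backslash. A hedged Python re-creation, on a made-up two-line Makefile fragment:

```python
import re

# Hypothetical Makefile fragment with a backslash line continuation.
makefile = (
    "CFLAGS ?= -O2\n"
    "APPS = aes/crypt_and_hash \\\n"
    "       hash/generic_sum\n"
)

# Mirror the Perl regex: lazy capture ending at the first '#' or newline
# that is not preceded by a backslash.
m = re.search(r'\n\s*APPS\s*=[\\\s]*(.*?)(?<!\\)[#\n]', makefile, re.S)
apps = re.split(r'(?:\s|\\)+', m.group(1))
print(apps)
```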
sub gen_app_files {
my @app_list = @_;
my $vsx_tpl = slurp_file( $vsx_app_tpl_file );
for my $app ( @app_list ) {
gen_app( $app, $vsx_tpl, $vsx_dir, $vsx_ext );
}
}
sub gen_entry_list {
my ($tpl, @names) = @_;
my $entries;
for my $name (@names) {
(my $entry = $tpl) =~ s/{NAME}/$name/g;
$entries .= $entry;
}
return $entries;
}
sub gen_main_file {
my ($headers, $sources,
$hdr_tpl, $src_tpl,
$main_tpl, $main_out) = @_;
my $header_entries = gen_entry_list( $hdr_tpl, @$headers );
my $source_entries = gen_entry_list( $src_tpl, @$sources );
my $out = slurp_file( $main_tpl );
$out =~ s/SOURCE_ENTRIES\n/$source_entries/m;
$out =~ s/HEADER_ENTRIES\n/$header_entries/m;
$out =~ s/INCLUDE_DIRECTORIES\n/$library_include_directories/g;
content_to_file( $out, $main_out );
}
sub gen_vsx_solution {
my (@app_names) = @_;
my ($app_entries, $conf_entries);
for my $path (@app_names) {
my $guid = gen_app_guid( $path );
(my $appname = $path) =~ s!.*/!!;
my $app_entry = $vsx_sln_app_entry_tpl;
$app_entry =~ s/{APPNAME}/$appname/g;
$app_entry =~ s/{GUID}/$guid/g;
$app_entries .= $app_entry;
my $conf_entry = $vsx_sln_conf_entry_tpl;
$conf_entry =~ s/{GUID}/$guid/g;
$conf_entries .= $conf_entry;
}
my $out = slurp_file( $vsx_sln_tpl_file );
$out =~ s/APP_ENTRIES\n/$app_entries/m;
$out =~ s/CONF_ENTRIES\n/$conf_entries/m;
content_to_file( $out, $vsx_sln_file );
}
sub del_vsx_files {
unlink glob "'$vsx_dir/*.$vsx_ext'";
unlink $vsx_main_file;
unlink $vsx_sln_file;
}
sub main {
if( ! check_dirs() ) {
chdir '..' or die;
check_dirs or die "Must be run from Mbed TLS root or scripts dir\n";
}
# Remove old files to ensure that, for example, project files from deleted
# apps are not kept
del_vsx_files();
my @app_list = get_app_list();
my @header_dirs = (
$mbedtls_header_dir,
$psa_header_dir,
$test_header_dir,
$test_drivers_header_dir,
$source_dir,
@thirdparty_header_dirs,
);
my @headers = (map { <$_/*.h> } @header_dirs);
my @source_dirs = (
$source_dir,
$test_source_dir,
$test_drivers_source_dir,
@thirdparty_source_dirs,
);
my @sources = (map { <$_/*.c> } @source_dirs);
@headers = grep { ! $excluded_files{$_} } @headers;
@sources = grep { ! $excluded_files{$_} } @sources;
map { s!/!\\!g } @headers;
map { s!/!\\!g } @sources;
gen_app_files( @app_list );
gen_main_file( \@headers, \@sources,
$vsx_hdr_tpl, $vsx_src_tpl,
$vsx_main_tpl_file, $vsx_main_file );
gen_vsx_solution( @app_list );
return 0;
}

externals/mbedtls/scripts/lcov.sh vendored Executable file

@@ -0,0 +1,82 @@
#!/bin/sh
help () {
cat <<EOF
Usage: $0 [-r]
Collect coverage statistics of library code into an HTML report.
General instructions:
1. Build the library with CFLAGS="--coverage -O0 -g3" and link the test
programs with LDFLAGS="--coverage".
This can be an out-of-tree build.
For example (in-tree):
make CFLAGS="--coverage -O0 -g3" LDFLAGS="--coverage"
Or (out-of-tree):
mkdir build-coverage && cd build-coverage &&
cmake -D CMAKE_BUILD_TYPE=Coverage .. && make
2. Run whatever tests you want.
3. Run this script from the parent of the directory containing the library
object files and coverage statistics files.
4. Browse the coverage report in Coverage/index.html.
5. After rework, run "$0 -r", then re-test and run "$0" to get a fresh report.
Options
-r Reset traces. Run this before re-testing to get fresh measurements.
EOF
}
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
set -eu
# Repository detection
in_mbedtls_build_dir () {
test -d library
}
# Collect stats and build a HTML report.
lcov_library_report () {
rm -rf Coverage
mkdir Coverage Coverage/tmp
lcov --capture --initial --directory $library_dir -o Coverage/tmp/files.info
lcov --rc lcov_branch_coverage=1 --capture --directory $library_dir -o Coverage/tmp/tests.info
lcov --rc lcov_branch_coverage=1 --add-tracefile Coverage/tmp/files.info --add-tracefile Coverage/tmp/tests.info -o Coverage/tmp/all.info
lcov --rc lcov_branch_coverage=1 --remove Coverage/tmp/all.info -o Coverage/tmp/final.info '*.h'
gendesc tests/Descriptions.txt -o Coverage/tmp/descriptions
genhtml --title "$title" --description-file Coverage/tmp/descriptions --keep-descriptions --legend --branch-coverage -o Coverage Coverage/tmp/final.info
rm -f Coverage/tmp/*.info Coverage/tmp/descriptions
echo "Coverage report in: Coverage/index.html"
}
# Reset the traces to 0.
lcov_reset_traces () {
# Location with plain make
rm -f $library_dir/*.gcda
# Location with CMake
rm -f $library_dir/CMakeFiles/*.dir/*.gcda
}
if [ $# -gt 0 ] && [ "$1" = "--help" ]; then
help
exit
fi
if in_mbedtls_build_dir; then
library_dir='library'
title='Mbed TLS'
else
library_dir='core'
title='TF-PSA-Crypto'
fi
main=lcov_library_report
while getopts r OPTLET; do
case $OPTLET in
r) main=lcov_reset_traces;;
*) help 2>&1; exit 120;;
esac
done
shift $((OPTIND - 1))
"$main" "$@"


@@ -0,0 +1,10 @@
# Python packages that are only useful to Mbed TLS maintainers.
-r ci.requirements.txt
# For source code analyses
clang
# For building some test vectors
pycryptodomex
pycryptodome-test-vectors


@@ -0,0 +1,15 @@
@rem Generate automatically-generated configuration-independent source files
@rem and build scripts.
@rem Perl and Python 3 must be on the PATH.
@rem psa_crypto_driver_wrappers.h needs to be generated prior to
@rem generate_visualc_files.pl being invoked.
python scripts\generate_driver_wrappers.py || exit /b 1
perl scripts\generate_errors.pl || exit /b 1
perl scripts\generate_query_config.pl || exit /b 1
perl scripts\generate_features.pl || exit /b 1
python scripts\generate_ssl_debug_helpers.py || exit /b 1
perl scripts\generate_visualc_files.pl || exit /b 1
python scripts\generate_psa_constants.py || exit /b 1
python tests\scripts\generate_bignum_tests.py || exit /b 1
python tests\scripts\generate_ecp_tests.py || exit /b 1
python tests\scripts\generate_psa_tests.py || exit /b 1

externals/mbedtls/scripts/massif_max.pl vendored Executable file

@@ -0,0 +1,36 @@
#!/usr/bin/env perl
# Parse a massif.out.xxx file and output peak total memory usage
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
use warnings;
use strict;
use utf8;
use open qw(:std utf8);
die unless @ARGV == 1;
my @snaps;
open my $fh, '<', $ARGV[0] or die;
{ local $/ = 'snapshot='; @snaps = <$fh>; }
close $fh or die;
my ($max, $max_heap, $max_he, $max_stack) = (0, 0, 0, 0);
for (@snaps)
{
my ($heap, $heap_extra, $stack) = m{
mem_heap_B=(\d+)\n
mem_heap_extra_B=(\d+)\n
mem_stacks_B=(\d+)
}xm;
next unless defined $heap;
my $total = $heap + $heap_extra + $stack;
if( $total > $max ) {
($max, $max_heap, $max_he, $max_stack) = ($total, $heap, $heap_extra, $stack);
}
}
printf "$max (heap $max_heap+$max_he, stack $max_stack)\n";


@@ -0,0 +1,3 @@
# This file needs to exist to make mbedtls_dev a package.
# Among other things, this allows modules in this directory to make
# relative imports.


@@ -0,0 +1,157 @@
"""Sample key material for asymmetric key types.
Meant for use in crypto_knowledge.py.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import binascii
import re
from typing import Dict
STR_TRANS_REMOVE_BLANKS = str.maketrans('', '', ' \t\n\r')
def unhexlify(text: str) -> bytes:
return binascii.unhexlify(text.translate(STR_TRANS_REMOVE_BLANKS))
def construct_asymmetric_key_data(src) -> Dict[str, Dict[int, bytes]]:
"""Split key pairs into separate table entries and convert hex to bytes.
Input format: src[abbreviated_type][size] = (private_key_hex, public_key_hex)
Output format: dst['PSA_KEY_TYPE_xxx'][size] = key_bytes
"""
dst = {} #type: Dict[str, Dict[int, bytes]]
for typ in src:
private = 'PSA_KEY_TYPE_' + re.sub(r'(\(|\Z)', r'_KEY_PAIR\1', typ, 1)
public = 'PSA_KEY_TYPE_' + re.sub(r'(\(|\Z)', r'_PUBLIC_KEY\1', typ, 1)
dst[private] = {}
dst[public] = {}
for size in src[typ]:
dst[private][size] = unhexlify(src[typ][size][0])
dst[public][size] = unhexlify(src[typ][size][1])
return dst
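The key-type rewriting above hinges on one `re.sub` trick: `(\(|\Z)` matches either the opening parenthesis of a parametrized type or the end of the name, so the `_KEY_PAIR` / `_PUBLIC_KEY` suffix lands in the right place for both `'RSA'` and `'ECC(...)'`. A small self-contained demonstration:

```python
import re

def key_pair_and_public(typ):
    """Replicate the name rewriting in construct_asymmetric_key_data:
    insert _KEY_PAIR / _PUBLIC_KEY before '(' or at the end of the name."""
    private = 'PSA_KEY_TYPE_' + re.sub(r'(\(|\Z)', r'_KEY_PAIR\1', typ, count=1)
    public = 'PSA_KEY_TYPE_' + re.sub(r'(\(|\Z)', r'_PUBLIC_KEY\1', typ, count=1)
    return private, public

print(key_pair_and_public('RSA'))
print(key_pair_and_public('ECC(PSA_ECC_FAMILY_SECP_K1)'))
```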
## These are valid keys that don't try to exercise any edge cases. They're
## either test vectors from some specification, or randomly generated. All
## pairs consist of a private key and its public key.
#pylint: disable=line-too-long
ASYMMETRIC_KEY_DATA = construct_asymmetric_key_data({
'ECC(PSA_ECC_FAMILY_SECP_K1)': {
192: ("297ac1722ccac7589ecb240dc719842538ca974beb79f228",
"0426b7bb38da649ac2138fc050c6548b32553dab68afebc36105d325b75538c12323cb0764789ecb992671beb2b6bef2f5"),
225: ("0024122bf020fa113f6c0ac978dfbd41f749257a9468febdbe0dc9f7e8",
"042cc7335f4b76042bed44ef45959a62aa215f7a5ff0c8111b8c44ed654ee71c1918326ad485b2d599fe2a6eab096ee26d977334d2bac6d61d"),
256: ("7fa06fa02d0e911b9a47fdc17d2d962ca01e2f31d60c6212d0ed7e3bba23a7b9",
"045c39154579efd667adc73a81015a797d2c8682cdfbd3c3553c4a185d481cdc50e42a0e1cbc3ca29a32a645e927f54beaed14c9dbbf8279d725f5495ca924b24d"),
},
'ECC(PSA_ECC_FAMILY_SECP_R1)': {
192: ("d83b57a59c51358d9c8bbb898aff507f44dd14cf16917190",
"04e35fcbee11cec3154f80a1a61df7d7612de4f2fd70c5608d0ee3a4a1a5719471adb33966dd9b035fdb774feeba94b04c"),
224: ("872f203b3ad35b7f2ecc803c3a0e1e0b1ed61cc1afe71b189cd4c995",
"046f00eadaa949fee3e9e1c7fa1247eecec86a0dce46418b9bd3117b981d4bd0ae7a990de912f9d060d6cb531a42d22e394ac29e81804bf160"),
256: ("49c9a8c18c4b885638c431cf1df1c994131609b580d4fd43a0cab17db2f13eee",
"047772656f814b399279d5e1f1781fac6f099a3c5ca1b0e35351834b08b65e0b572590cdaf8f769361bcf34acfc11e5e074e8426bdde04be6e653945449617de45"),
384: ("3f5d8d9be280b5696cc5cc9f94cf8af7e6b61dd6592b2ab2b3a4c607450417ec327dcdcaed7c10053d719a0574f0a76a",
"04d9c662b50ba29ca47990450e043aeaf4f0c69b15676d112f622a71c93059af999691c5680d2b44d111579db12f4a413a2ed5c45fcfb67b5b63e00b91ebe59d09a6b1ac2c0c4282aa12317ed5914f999bc488bb132e8342cc36f2ca5e3379c747"),
521: ("01b1b6ad07bb79e7320da59860ea28e055284f6058f279de666e06d435d2af7bda28d99fa47b7dd0963e16b0073078ee8b8a38d966a582f46d19ff95df3ad9685aae",
"04001de142d54f69eb038ee4b7af9d3ca07736fd9cf719eb354d69879ee7f3c136fb0fbf9f08f86be5fa128ec1a051d3e6c643e85ada8ffacf3663c260bd2c844b6f5600cee8e48a9e65d09cadd89f235dee05f3b8a646be715f1f67d5b434e0ff23a1fc07ef7740193e40eeff6f3bcdfd765aa9155033524fe4f205f5444e292c4c2f6ac1"),
},
'ECC(PSA_ECC_FAMILY_SECP_R2)': {
160: ("00bf539a1cdda0d7f71a50a3f98aec0a2e8e4ced1e",
"049570d541398665adb5cfa16f5af73b3196926bbd4b876bdb80f8eab20d0f540c22f4de9c140f6d7b"),
},
'ECC(PSA_ECC_FAMILY_SECT_K1)': {
163: ("03ebc8fcded2d6ab72ec0f75bdb4fd080481273e71",
"0406f88f90b4b65950f06ce433afdb097e320f433dc2062b8a65db8fafd3c110f46bc45663fbf021ee7eb9"),
233: ("41f08485ce587b06061c087e76e247c359de2ba9927ee013b2f1ed9ca8",
"0401e9d7189189f773bd8f71be2c10774ba18842434dfa9312595ea545104400f45a9d5675647513ba75b079fe66a29daac2ec86a6a5d4e75c5f290c1f"),
239: ("1a8069ce2c2c8bdd7087f2a6ab49588797e6294e979495602ab9650b9c61",
"04068d76b9f4508762c2379db9ee8b87ad8d86d9535132ffba3b5680440cfa28eb133d4232faf1c9aba96af11aefe634a551440800d5f8185105d3072d"),
283: ("006d627885dd48b9ec6facb5b3865377d755b75a5d51440e45211c1f600e15eff8a881a0",
"0405f48374debceaadb46ba385fd92048fcc5b9af1a1c90408bf94a68b9378df1cbfdfb6fb026a96bea06d8f181bf10c020adbcc88b6ecff96bdc564a9649c247cede601c4be63afc3"),
409: ("3ff5e74d932fa77db139b7c948c81e4069c72c24845574064beea8976b70267f1c6f9a503e3892ea1dcbb71fcea423faa370a8",
"04012c587f69f68b308ba6dcb238797f4e22290ca939ae806604e2b5ab4d9caef5a74a98fd87c4f88d292dd39d92e556e16c6ecc3c019a105826eef507cd9a04119f54d5d850b3720b3792d5d03410e9105610f7e4b420166ed45604a7a1f229d80975ba6be2060e8b"),
571: ("005008c97b4a161c0db1bac6452c72846d57337aa92d8ecb4a66eb01d2f29555ffb61a5317225dcc8ca6917d91789e227efc0bfe9eeda7ee21998cd11c3c9885056b0e55b4f75d51",
"04050172a7fd7adf98e4e2ed2742faa5cd12731a15fb0dbbdf75b1c3cc771a4369af6f2fa00e802735650881735759ea9c79961ded18e0daa0ac59afb1d513b5bbda9962e435f454fc020b4afe1445c2302ada07d295ec2580f8849b2dfa7f956b09b4cbe4c88d3b1c217049f75d3900d36df0fa12689256b58dd2ef784ebbeb0564600cf47a841485f8cf897a68accd5a"),
},
'ECC(PSA_ECC_FAMILY_SECT_R1)': {
163: ("009b05dc82d46d64a04a22e6e5ca70ca1231e68c50",
"0400465eeb9e7258b11e33c02266bfe834b20bcb118700772796ee4704ec67651bd447e3011959a79a04cb"),
233: ("00e5e42834e3c78758088b905deea975f28dc20ef6173e481f96e88afe7f",
"0400cd68c8af4430c92ec7a7048becfdf00a6bae8d1b4c37286f2d336f2a0e017eca3748f4ad6d435c85867aa014eea1bd6d9d005bbd8319cab629001d"),
283: ("004cecad915f6f3c9bbbd92d1eb101eda23f16c7dad60a57c87c7e1fd2b29b22f6d666ad",
"04052f9ff887254c2d1440ba9e30f13e2185ba53c373b2c410dae21cf8c167f796c08134f601cbc4c570bffbc2433082cf4d9eb5ba173ecb8caec15d66a02673f60807b2daa729b765"),
409: ("00c22422d265721a3ae2b3b2baeb77bee50416e19877af97b5fc1c700a0a88916ecb9050135883accb5e64edc77a3703f4f67a64",
"0401aa25466b1d291846db365957b25431591e50d9c109fe2106e93bb369775896925b15a7bfec397406ab4fe6f6b1a13bf8fdcb9300fa5500a813228676b0a6c572ed96b0f4aec7e87832e7e20f17ca98ecdfd36f59c82bddb8665f1f357a73900e827885ec9e1f22"),
571: ("026ac1cdf92a13a1b8d282da9725847908745138f5c6706b52d164e3675fcfbf86fc3e6ab2de732193267db029dd35a0599a94a118f480231cfc6ccca2ebfc1d8f54176e0f5656a1",
"040708f3403ee9948114855c17572152a08f8054d486defef5f29cbffcfb7cfd9280746a1ac5f751a6ad902ec1e0525120e9be56f03437af196fbe60ee7856e3542ab2cf87880632d80290e39b1a2bd03c6bbf6225511c567bd2ff41d2325dc58346f2b60b1feee4dc8b2af2296c2dc52b153e0556b5d24152b07f690c3fa24e4d1d19efbdeb1037833a733654d2366c74"),
},
'ECC(PSA_ECC_FAMILY_SECT_R2)': {
163: ("0210b482a458b4822d0cb21daa96819a67c8062d34",
"0403692601144c32a6cfa369ae20ae5d43c1c764678c037bafe80c6fd2e42b7ced96171d9c5367fd3dca6f"),
},
'ECC(PSA_ECC_FAMILY_BRAINPOOL_P_R1)': {
160: ("69502c4fdaf48d4fa617bdd24498b0406d0eeaac",
"04d4b9186816358e2f9c59cf70748cb70641b22fbab65473db4b4e22a361ed7e3de7e8a8ddc4130c5c"),
192: ("1688a2c5fbf4a3c851d76a98c3ec88f445a97996283db59f",
"043fdd168c179ff5363dd71dcd58de9617caad791ae0c37328be9ca0bfc79cebabf6a95d1c52df5b5f3c8b1a2441cf6c88"),
224: ("a69835dafeb5da5ab89c59860dddebcfd80b529a99f59b880882923c",
"045fbea378fc8583b3837e3f21a457c31eaf20a54e18eb11d104b3adc47f9d1c97eb9ea4ac21740d70d88514b98bf0bc31addac1d19c4ab3cc"),
256: ("2161d6f2db76526fa62c16f356a80f01f32f776784b36aa99799a8b7662080ff",
"04768c8cae4abca6306db0ed81b0c4a6215c378066ec6d616c146e13f1c7df809b96ab6911c27d8a02339f0926840e55236d3d1efbe2669d090e4c4c660fada91d"),
320: ("61b8daa7a6e5aa9fccf1ef504220b2e5a5b8c6dc7475d16d3172d7db0b2778414e4f6e8fa2032ead",
"049caed8fb4742956cc2ad12a9a1c995e21759ef26a07bc2054136d3d2f28bb331a70e26c4c687275ab1f434be7871e115d2350c0c5f61d4d06d2bcdb67f5cb63fdb794e5947c87dc6849a58694e37e6cd"),
384: ("3dd92e750d90d7d39fc1885cd8ad12ea9441f22b9334b4d965202adb1448ce24c5808a85dd9afc229af0a3124f755bcb",
"04719f9d093a627e0d350385c661cebf00c61923566fe9006a3107af1d871bc6bb68985fd722ea32be316f8e783b7cd1957785f66cfc0cb195dd5c99a8e7abaa848553a584dfd2b48e76d445fe00dd8be59096d877d4696d23b4bc8db14724e66a"),
512: ("372c9778f69f726cbca3f4a268f16b4d617d10280d79a6a029cd51879fe1012934dfe5395455337df6906dc7d6d2eea4dbb2065c0228f73b3ed716480e7d71d2",
"0438b7ec92b61c5c6c7fbc28a4ec759d48fcd4e2e374defd5c4968a54dbef7510e517886fbfc38ea39aa529359d70a7156c35d3cbac7ce776bdb251dd64bce71234424ee7049eed072f0dbc4d79996e175d557e263763ae97095c081e73e7db2e38adc3d4c9a0487b1ede876dc1fca61c902e9a1d8722b8612928f18a24845591a"),
},
'ECC(PSA_ECC_FAMILY_MONTGOMERY)': {
255: ("70076d0a7318a57d3c16c17251b26645df4c2f87ebc0992ab177fba51db92c6a",
"8520f0098930a754748b7ddcb43ef75a0dbf3a0d26381af4eba4a98eaa9b4e6a"),
448: ("e4e49f52686f9ee3b638528f721f1596196ffd0a1cddb64c3f216f06541805cfeb1a286dc78018095cdfec050e8007b5f4908962ba20d6c1",
"c0d3a5a2b416a573dc9909f92f134ac01323ab8f8e36804e578588ba2d09fe7c3e737f771ca112825b548a0ffded6d6a2fd09a3e77dec30e"),
},
'ECC(PSA_ECC_FAMILY_TWISTED_EDWARDS)': {
255: ("9d61b19deffd5a60ba844af492ec2cc44449c5697b326919703bac031cae7f60",
"d75a980182b10ab7d54bfed3c964073a0ee172f3daa62325af021a68f707511a"),
448: ("6c82a562cb808d10d632be89c8513ebf6c929f34ddfa8c9f63c9960ef6e348a3528c8a3fcc2f044e39a3fc5b94492f8f032e7549a20098f95b",
"5fd7449b59b461fd2ce787ec616ad46a1da1342485a70e1f8a0ea75d80e96778edf124769b46c7061bd6783df1e50f6cd1fa1abeafe8256180"),
},
'RSA': {
1024: ("""
3082025e
020100
02818100af057d396ee84fb75fdbb5c2b13c7fe5a654aa8aa2470b541ee1feb0b12d25c79711531249e1129628042dbbb6c120d1443524ef4c0e6e1d8956eeb2077af12349ddeee54483bc06c2c61948cd02b202e796aebd94d3a7cbf859c2c1819c324cb82b9cd34ede263a2abffe4733f077869e8660f7d6834da53d690ef7985f6bc3
0203010001
02818100874bf0ffc2f2a71d14671ddd0171c954d7fdbf50281e4f6d99ea0e1ebcf82faa58e7b595ffb293d1abe17f110b37c48cc0f36c37e84d876621d327f64bbe08457d3ec4098ba2fa0a319fba411c2841ed7be83196a8cdf9daa5d00694bc335fc4c32217fe0488bce9cb7202e59468b1ead119000477db2ca797fac19eda3f58c1
024100e2ab760841bb9d30a81d222de1eb7381d82214407f1b975cbbfe4e1a9467fd98adbd78f607836ca5be1928b9d160d97fd45c12d6b52e2c9871a174c66b488113
024100c5ab27602159ae7d6f20c3c2ee851e46dc112e689e28d5fcbbf990a99ef8a90b8bb44fd36467e7fc1789ceb663abda338652c3c73f111774902e840565927091
024100b6cdbd354f7df579a63b48b3643e353b84898777b48b15f94e0bfc0567a6ae5911d57ad6409cf7647bf96264e9bd87eb95e263b7110b9a1f9f94acced0fafa4d
024071195eec37e8d257decfc672b07ae639f10cbb9b0c739d0c809968d644a94e3fd6ed9287077a14583f379058f76a8aecd43c62dc8c0f41766650d725275ac4a1
024100bb32d133edc2e048d463388b7be9cb4be29f4b6250be603e70e3647501c97ddde20a4e71be95fd5e71784e25aca4baf25be5738aae59bbfe1c997781447a2b24
""", """
308189
02818100af057d396ee84fb75fdbb5c2b13c7fe5a654aa8aa2470b541ee1feb0b12d25c79711531249e1129628042dbbb6c120d1443524ef4c0e6e1d8956eeb2077af12349ddeee54483bc06c2c61948cd02b202e796aebd94d3a7cbf859c2c1819c324cb82b9cd34ede263a2abffe4733f077869e8660f7d6834da53d690ef7985f6bc3
0203010001
"""),
1536: ("""
3082037b
020100
0281c100c870feb6ca6b1d2bd9f2dd99e20f1fe2d7e5192de662229dbe162bd1ba66336a7182903ca0b72796cd441c83d24bcdc3e9a2f5e4399c8a043f1c3ddf04754a66d4cfe7b3671a37dd31a9b4c13bfe06ee90f9d94ddaa06de67a52ac863e68f756736ceb014405a6160579640f831dddccc34ad0b05070e3f9954a58d1815813e1b83bcadba814789c87f1ef2ba5d738b793ec456a67360eea1b5faf1c7cc7bf24f3b2a9d0f8958b1096e0f0c335f8888d0c63a51c3c0337214fa3f5efdf6dcc35
0203010001
0281c06d2d670047973a87752a9d5bc14f3dae00acb01f593aa0e24cf4a49f932931de4bbfb332e2d38083da80bc0b6d538edba479f7f77d0deffb4a28e6e67ff6273585bb4cd862535c946605ab0809d65f0e38f76e4ec2c3d9b8cd6e14bcf667943892cd4b34cc6420a439abbf3d7d35ef73976dd6f9cbde35a51fa5213f0107f83e3425835d16d3c9146fc9e36ce75a09bb66cdff21dd5a776899f1cb07e282cca27be46510e9c799f0d8db275a6be085d9f3f803218ee3384265bfb1a3640e8ca1
026100e6848c31d466fffefc547e3a3b0d3785de6f78b0dd12610843512e495611a0675509b1650b27415009838dd8e68eec6e7530553b637d602424643b33e8bc5b762e1799bc79d56b13251d36d4f201da2182416ce13574e88278ff04467ad602d9
026100de994fdf181f02be2bf9e5f5e4e517a94993b827d1eaf609033e3a6a6f2396ae7c44e9eb594cf1044cb3ad32ea258f0c82963b27bb650ed200cde82cb993374be34be5b1c7ead5446a2b82a4486e8c1810a0b01551609fb0841d474bada802bd
026076ddae751b73a959d0bfb8ff49e7fcd378e9be30652ecefe35c82cb8003bc29cc60ae3809909baf20c95db9516fe680865417111d8b193dbcf30281f1249de57c858bf1ba32f5bb1599800e8398a9ef25c7a642c95261da6f9c17670e97265b1
0260732482b837d5f2a9443e23c1aa0106d83e82f6c3424673b5fdc3769c0f992d1c5c93991c7038e882fcda04414df4d7a5f4f698ead87851ce37344b60b72d7b70f9c60cae8566e7a257f8e1bef0e89df6e4c2f9d24d21d9f8889e4c7eccf91751
026009050d94493da8f00a4ddbe9c800afe3d44b43f78a48941a79b2814a1f0b81a18a8b2347642a03b27998f5a18de9abc9ae0e54ab8294feac66dc87e854cce6f7278ac2710cb5878b592ffeb1f4f0a1853e4e8d1d0561b6efcc831a296cf7eeaf
""", """
3081c9
0281c100c870feb6ca6b1d2bd9f2dd99e20f1fe2d7e5192de662229dbe162bd1ba66336a7182903ca0b72796cd441c83d24bcdc3e9a2f5e4399c8a043f1c3ddf04754a66d4cfe7b3671a37dd31a9b4c13bfe06ee90f9d94ddaa06de67a52ac863e68f756736ceb014405a6160579640f831dddccc34ad0b05070e3f9954a58d1815813e1b83bcadba814789c87f1ef2ba5d738b793ec456a67360eea1b5faf1c7cc7bf24f3b2a9d0f8958b1096e0f0c335f8888d0c63a51c3c0337214fa3f5efdf6dcc35
0203010001
"""),
},
})


@@ -0,0 +1,406 @@
"""Common features for bignum in test generation framework."""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
from abc import abstractmethod
import enum
from typing import Iterator, List, Tuple, TypeVar, Any
from copy import deepcopy
from itertools import chain
from math import ceil
from . import test_case
from . import test_data_generation
from .bignum_data import INPUTS_DEFAULT, MODULI_DEFAULT
T = TypeVar('T') #pylint: disable=invalid-name
def invmod(a: int, n: int) -> int:
"""Return the inverse of a modulo n.
Equivalent to pow(a, -1, n) in Python 3.8+. Implementation is equivalent
to long_invmod() in CPython.
"""
b, c = 1, 0
while n:
q, r = divmod(a, n)
a, b, c, n = n, c, b - q*c, r
# at this point a is the gcd of the original inputs
if a == 1:
return b
raise ValueError("Not invertible")
def invmod_positive(a: int, n: int) -> int:
"""Return a non-negative inverse of a modulo n."""
inv = invmod(a, n)
return inv if inv >= 0 else inv + n
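The extended-Euclid loop in `invmod` can return a negative representative, which is why `invmod_positive` exists. A self-contained worked check (the function body is restated from above so the example runs on its own):

```python
def invmod(a, n):
    """Extended-Euclid modular inverse, as defined above."""
    b, c = 1, 0
    while n:
        q, r = divmod(a, n)
        a, b, c, n = n, c, b - q*c, r
    # at this point a is the gcd of the original inputs
    if a == 1:
        return b
    raise ValueError("Not invertible")

# 3 * 5 == 15 == 2*7 + 1, so 5 is the inverse of 3 modulo 7.
inv = invmod(3, 7)   # returns -2, a negative representative of 5 mod 7
print(inv % 7)       # -> 5
```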
def hex_to_int(val: str) -> int:
"""Implement the syntax accepted by mbedtls_test_read_mpi().
This is a superset of what is accepted by mbedtls_test_read_mpi_core().
"""
if val in ['', '-']:
return 0
return int(val, 16)
def quote_str(val: str) -> str:
return "\"{}\"".format(val)
def bound_mpi(val: int, bits_in_limb: int) -> int:
"""Return the first number exceeding the capacity of the limbs needed for the given value."""
return bound_mpi_limbs(limbs_mpi(val, bits_in_limb), bits_in_limb)
def bound_mpi_limbs(limbs: int, bits_in_limb: int) -> int:
"""Return the first number exceeding the maximum value representable in the given number of limbs."""
bits = bits_in_limb * limbs
return 1 << bits
def limbs_mpi(val: int, bits_in_limb: int) -> int:
"""Return the number of limbs required to store value."""
bit_length = max(val.bit_length(), 1)
return (bit_length + bits_in_limb - 1) // bits_in_limb
def combination_pairs(values: List[T]) -> List[Tuple[T, T]]:
"""Return all pair combinations from input values."""
return [(x, y) for x in values for y in values]
def bits_to_limbs(bits: int, bits_in_limb: int) -> int:
""" Return the number of limbs needed to store
a number of the given bit size."""
return ceil(bits / bits_in_limb)
def hex_digits_for_limb(limbs: int, bits_in_limb: int) -> int:
""" Return the number of hex digits needed for a given number of limbs. """
return 2 * ((limbs * bits_in_limb) // 8)
def hex_digits_max_int(val: str, bits_in_limb: int) -> int:
""" Return the first number exceeding the limb space
required to store the input hex-string value. This function
weighs the input by its string length rather than its numerical
value, so it works with zero-padded inputs."""
n = ((1 << (len(val) * 4)) - 1)
l = limbs_mpi(n, bits_in_limb)
return bound_mpi_limbs(l, bits_in_limb)
def zfill_match(reference: str, target: str) -> str:
""" Zero pad target hex-string to match the limb size of
the reference input """
lt = len(target)
lr = len(reference)
target_len = lr if lt < lr else lt
return "{:x}".format(int(target, 16)).zfill(target_len)
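The limb-size helpers above are easy to sanity-check by hand. A self-contained worked example (the two helpers are restated so it runs standalone), using 2^32 as a value that is exactly one bit too large for a single 32-bit limb:

```python
BITS_IN_LIMB = 32

def limbs_mpi(val, bits_in_limb):
    """Number of limbs required to store val (as defined above)."""
    bit_length = max(val.bit_length(), 1)
    return (bit_length + bits_in_limb - 1) // bits_in_limb

def bound_mpi_limbs(limbs, bits_in_limb):
    """First number exceeding the capacity of the given number of limbs."""
    return 1 << (bits_in_limb * limbs)

val = 0x100000000   # 2**32: bit_length 33, so it needs two 32-bit limbs
limbs = limbs_mpi(val, BITS_IN_LIMB)
print(limbs, hex(bound_mpi_limbs(limbs, BITS_IN_LIMB)))
```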
class OperationCommon(test_data_generation.BaseTest):
"""Common features for bignum binary operations.
This adds functionality common in binary operation tests.
Attributes:
symbol: Symbol to use for the operation in case description.
input_values: List of values to use as test case inputs. These are
combined to produce pairs of values.
input_cases: List of tuples containing pairs of test case inputs. This
can be used to implement specific pairs of inputs.
unique_combinations_only: Boolean to select if test case combinations
must be unique. If True, only A,B or B,A would be included as a test
case. If False, both A,B and B,A would be included.
        input_style: Controls how test data is passed to the functions
in the generated test cases. "variable" passes them as they are
defined in the python source. "arch_split" pads the values with
zeroes depending on the architecture/limb size. If this is set,
test cases are generated for all architectures.
        arity: The number of operands for the operation. Currently supported
values are 1 and 2.
"""
symbol = ""
input_values = INPUTS_DEFAULT # type: List[str]
input_cases = [] # type: List[Any]
dependencies = [] # type: List[Any]
unique_combinations_only = False
input_styles = ["variable", "fixed", "arch_split"] # type: List[str]
input_style = "variable" # type: str
limb_sizes = [32, 64] # type: List[int]
arities = [1, 2]
arity = 2
suffix = False # for arity = 1, symbol can be prefix (default) or suffix
def __init__(self, val_a: str, val_b: str = "0", bits_in_limb: int = 32) -> None:
self.val_a = val_a
self.val_b = val_b
# Setting the int versions here as opposed to making them @properties
# provides earlier/more robust input validation.
self.int_a = hex_to_int(val_a)
self.int_b = hex_to_int(val_b)
self.dependencies = deepcopy(self.dependencies)
if bits_in_limb not in self.limb_sizes:
raise ValueError("Invalid number of bits in limb!")
if self.input_style == "arch_split":
self.dependencies.append("MBEDTLS_HAVE_INT{:d}".format(bits_in_limb))
self.bits_in_limb = bits_in_limb
@property
def boundary(self) -> int:
if self.arity == 1:
return self.int_a
elif self.arity == 2:
return max(self.int_a, self.int_b)
raise ValueError("Unsupported number of operands!")
@property
def limb_boundary(self) -> int:
return bound_mpi(self.boundary, self.bits_in_limb)
@property
def limbs(self) -> int:
return limbs_mpi(self.boundary, self.bits_in_limb)
@property
def hex_digits(self) -> int:
return hex_digits_for_limb(self.limbs, self.bits_in_limb)
def format_arg(self, val: str) -> str:
if self.input_style not in self.input_styles:
raise ValueError("Unknown input style!")
if self.input_style == "variable":
return val
else:
return val.zfill(self.hex_digits)
def format_result(self, res: int) -> str:
res_str = '{:x}'.format(res)
return quote_str(self.format_arg(res_str))
@property
def arg_a(self) -> str:
return self.format_arg(self.val_a)
@property
def arg_b(self) -> str:
if self.arity == 1:
raise AttributeError("Operation is unary and doesn't have arg_b!")
return self.format_arg(self.val_b)
def arguments(self) -> List[str]:
args = [quote_str(self.arg_a)]
if self.arity == 2:
args.append(quote_str(self.arg_b))
return args + self.result()
def description(self) -> str:
"""Generate a description for the test case.
If not set, case_description uses the form A `symbol` B, where symbol
is used to represent the operation. Descriptions of each value are
generated to provide some context to the test case.
"""
if not self.case_description:
if self.arity == 1:
format_string = "{1:x} {0}" if self.suffix else "{0} {1:x}"
self.case_description = format_string.format(
self.symbol, self.int_a
)
elif self.arity == 2:
self.case_description = "{:x} {} {:x}".format(
self.int_a, self.symbol, self.int_b
)
return super().description()
@property
def is_valid(self) -> bool:
return True
@abstractmethod
def result(self) -> List[str]:
"""Get the result of the operation.
This could be calculated during initialization and stored as `_result`
and then returned, or calculated when the method is called.
"""
raise NotImplementedError
@classmethod
def get_value_pairs(cls) -> Iterator[Tuple[str, str]]:
"""Generator to yield pairs of inputs.
Combinations are first generated from all input values, and then
specific cases provided.
"""
if cls.arity == 1:
yield from ((a, "0") for a in cls.input_values)
elif cls.arity == 2:
if cls.unique_combinations_only:
yield from combination_pairs(cls.input_values)
else:
yield from (
(a, b)
for a in cls.input_values
for b in cls.input_values
)
else:
raise ValueError("Unsupported number of operands!")
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
if cls.input_style not in cls.input_styles:
raise ValueError("Unknown input style!")
if cls.arity not in cls.arities:
raise ValueError("Unsupported number of operands!")
if cls.input_style == "arch_split":
test_objects = (cls(a, b, bits_in_limb=bil)
for a, b in cls.get_value_pairs()
for bil in cls.limb_sizes)
special_cases = (cls(*args, bits_in_limb=bil) # type: ignore
for args in cls.input_cases
for bil in cls.limb_sizes)
else:
test_objects = (cls(a, b)
for a, b in cls.get_value_pairs())
special_cases = (cls(*args) for args in cls.input_cases)
yield from (valid_test_object.create_test_case()
for valid_test_object in filter(
lambda test_object: test_object.is_valid,
chain(test_objects, special_cases)
)
)
class ModulusRepresentation(enum.Enum):
"""Representation selector of a modulus."""
# Numerical values aligned with the type mbedtls_mpi_mod_rep_selector
INVALID = 0
MONTGOMERY = 2
OPT_RED = 3
def symbol(self) -> str:
"""The C symbol for this representation selector."""
return 'MBEDTLS_MPI_MOD_REP_' + self.name
@classmethod
def supported_representations(cls) -> List['ModulusRepresentation']:
"""Return all representations that are supported in positive test cases."""
return [cls.MONTGOMERY, cls.OPT_RED]
class ModOperationCommon(OperationCommon):
#pylint: disable=abstract-method
"""Target for bignum mod_raw test case generation."""
moduli = MODULI_DEFAULT # type: List[str]
montgomery_form_a = False
disallow_zero_a = False
def __init__(self, val_n: str, val_a: str, val_b: str = "0",
bits_in_limb: int = 64) -> None:
super().__init__(val_a=val_a, val_b=val_b, bits_in_limb=bits_in_limb)
self.val_n = val_n
# Setting the int versions here as opposed to making them @properties
# provides earlier/more robust input validation.
self.int_n = hex_to_int(val_n)
def to_montgomery(self, val: int) -> int:
return (val * self.r) % self.int_n
def from_montgomery(self, val: int) -> int:
return (val * self.r_inv) % self.int_n
def convert_from_canonical(self, canonical: int,
rep: ModulusRepresentation) -> int:
"""Convert values from canonical representation to the given representation."""
if rep is ModulusRepresentation.MONTGOMERY:
return self.to_montgomery(canonical)
elif rep is ModulusRepresentation.OPT_RED:
return canonical
else:
raise ValueError('Modulus representation not supported: {}'
.format(rep.name))
@property
def boundary(self) -> int:
return self.int_n
@property
def arg_a(self) -> str:
if self.montgomery_form_a:
value_a = self.to_montgomery(self.int_a)
else:
value_a = self.int_a
return self.format_arg('{:x}'.format(value_a))
@property
def arg_n(self) -> str:
return self.format_arg(self.val_n)
def format_arg(self, val: str) -> str:
return super().format_arg(val).zfill(self.hex_digits)
def arguments(self) -> List[str]:
return [quote_str(self.arg_n)] + super().arguments()
@property
def r(self) -> int: # pylint: disable=invalid-name
l = limbs_mpi(self.int_n, self.bits_in_limb)
return bound_mpi_limbs(l, self.bits_in_limb)
@property
def r_inv(self) -> int:
return invmod(self.r, self.int_n)
@property
def r2(self) -> int: # pylint: disable=invalid-name
return pow(self.r, 2)
@property
def is_valid(self) -> bool:
if self.int_a >= self.int_n:
return False
if self.disallow_zero_a and self.int_a == 0:
return False
if self.arity == 2 and self.int_b >= self.int_n:
return False
return True
def description(self) -> str:
"""Generate a description for the test case.
It uses the form A `symbol` B mod N, where symbol is used to represent
the operation.
"""
if not self.case_description:
return super().description() + " mod {:x}".format(self.int_n)
return super().description()
@classmethod
def input_cases_args(cls) -> Iterator[Tuple[Any, Any, Any]]:
if cls.arity == 1:
yield from ((n, a, "0") for a, n in cls.input_cases)
elif cls.arity == 2:
yield from ((n, a, b) for a, b, n in cls.input_cases)
else:
raise ValueError("Unsupported number of operands!")
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
if cls.input_style not in cls.input_styles:
raise ValueError("Unknown input style!")
if cls.arity not in cls.arities:
raise ValueError("Unsupported number of operands!")
if cls.input_style == "arch_split":
test_objects = (cls(n, a, b, bits_in_limb=bil)
for n in cls.moduli
for a, b in cls.get_value_pairs()
for bil in cls.limb_sizes)
special_cases = (cls(*args, bits_in_limb=bil)
for args in cls.input_cases_args()
for bil in cls.limb_sizes)
else:
test_objects = (cls(n, a, b)
for n in cls.moduli
for a, b in cls.get_value_pairs())
special_cases = (cls(*args) for args in cls.input_cases_args())
yield from (valid_test_object.create_test_case()
for valid_test_object in filter(
lambda test_object: test_object.is_valid,
chain(test_objects, special_cases)
))
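The Montgomery conversions in `ModOperationCommon` above are mutual inverses modulo N, with R chosen as the limb bound of N. A hedged standalone sketch (the demo function and its names are illustrative only; the modular inverse uses Python 3.8+ three-argument `pow`):

```python
def montgomery_demo(n, bits_in_limb=32):
    # R is the first power of the limb base exceeding n.
    limbs = (max(n.bit_length(), 1) + bits_in_limb - 1) // bits_in_limb
    r = 1 << (limbs * bits_in_limb)
    r_inv = pow(r, -1, n)  # exists because n is odd, so gcd(r, n) == 1
    to_mont = lambda v: (v * r) % n
    from_mont = lambda v: (v * r_inv) % n
    return to_mont, from_mont

to_mont, from_mont = montgomery_demo(0x10001)
# Round-tripping through Montgomery form is the identity mod N.
for v in (0, 1, 0xabcd, 0xffff):
    assert from_mont(to_mont(v)) == v % 0x10001
```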

View file

@ -0,0 +1,896 @@
"""Framework classes for generation of bignum core test cases."""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import random
from typing import Dict, Iterator, List, Tuple
from . import test_case
from . import test_data_generation
from . import bignum_common
from .bignum_data import ADD_SUB_DATA
class BignumCoreTarget(test_data_generation.BaseTarget):
#pylint: disable=abstract-method, too-few-public-methods
"""Target for bignum core test case generation."""
target_basename = 'test_suite_bignum_core.generated'
class BignumCoreShiftR(BignumCoreTarget, test_data_generation.BaseTest):
"""Test cases for mbedtls_bignum_core_shift_r()."""
count = 0
test_function = "mpi_core_shift_r"
test_name = "Core shift right"
DATA = [
('00', '0', [0, 1, 8]),
('01', '1', [0, 1, 2, 8, 64]),
('dee5ca1a7ef10a75', '64-bit',
list(range(11)) + [31, 32, 33, 63, 64, 65, 71, 72]),
('002e7ab0070ad57001', '[leading 0 limb]',
[0, 1, 8, 63, 64]),
('a1055eb0bb1efa1150ff', '80-bit',
[0, 1, 8, 63, 64, 65, 72, 79, 80, 81, 88, 128, 129, 136]),
('020100000000000000001011121314151617', '138-bit',
[0, 1, 8, 9, 16, 72, 73, 136, 137, 138, 144]),
]
def __init__(self, input_hex: str, descr: str, count: int) -> None:
self.input_hex = input_hex
self.number_description = descr
self.shift_count = count
self.result = bignum_common.hex_to_int(input_hex) >> count
def arguments(self) -> List[str]:
return ['"{}"'.format(self.input_hex),
str(self.shift_count),
'"{:0{}x}"'.format(self.result, len(self.input_hex))]
def description(self) -> str:
return 'Core shift {} >> {}'.format(self.number_description,
self.shift_count)
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
for input_hex, descr, counts in cls.DATA:
for count in counts:
yield cls(input_hex, descr, count).create_test_case()
class BignumCoreShiftL(BignumCoreTarget, bignum_common.ModOperationCommon):
"""Test cases for mbedtls_bignum_core_shift_l()."""
BIT_SHIFT_VALUES = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', 'a',
'1f', '20', '21', '3f', '40', '41', '47', '48', '4f',
'50', '51', '58', '80', '81', '88']
DATA = ["0", "1", "40", "dee5ca1a7ef10a75", "a1055eb0bb1efa1150ff",
"002e7ab0070ad57001", "020100000000000000001011121314151617",
"1946e2958a85d8863ae21f4904fcc49478412534ed53eaf321f63f2a222"
"7a3c63acbf50b6305595f90cfa8327f6db80d986fe96080bcbb5df1bdbe"
"9b74fb8dedf2bddb3f8215b54dffd66409323bcc473e45a8fe9d08e77a51"
"1698b5dad0416305db7fcf"]
arity = 1
test_function = "mpi_core_shift_l"
test_name = "Core shift(L)"
input_style = "arch_split"
symbol = "<<"
input_values = BIT_SHIFT_VALUES
moduli = DATA
@property
def val_n_max_limbs(self) -> int:
""" Return the limb count required to store the maximum number that can
fit in a the number of digits used by val_n """
m = bignum_common.hex_digits_max_int(self.val_n, self.bits_in_limb) - 1
return bignum_common.limbs_mpi(m, self.bits_in_limb)
def arguments(self) -> List[str]:
return [bignum_common.quote_str(self.val_n),
str(self.int_a)
] + self.result()
def description(self) -> str:
""" Format the output as:
#{count} {hex input} ({input bits} {limbs capacity}) << {bit shift} """
bits = "({} bits in {} limbs)".format(self.int_n.bit_length(), self.val_n_max_limbs)
return "{} #{} {} {} {} {}".format(self.test_name,
self.count,
self.val_n,
bits,
self.symbol,
self.int_a)
def format_result(self, res: int) -> str:
        # Override to match the zero-padding of leading digits between the output and input.
res_str = bignum_common.zfill_match(self.val_n, "{:x}".format(res))
return bignum_common.quote_str(res_str)
def result(self) -> List[str]:
result = (self.int_n << self.int_a)
        # Check whether there is space for shifting to the left (leading zero limbs)
mx = bignum_common.hex_digits_max_int(self.val_n, self.bits_in_limb)
# If there are empty limbs ahead, adjust the bitmask accordingly
result = result & (mx - 1)
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
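`BignumCoreShiftL.result()` above keeps only the bits that fit in the operand's fixed limb capacity, masking away anything shifted past it. A standalone sketch of that masking (helper name and the inlined limb math are illustrative):

```python
def core_shift_l(val_hex, count, bits_in_limb=32):
    # Capacity is derived from the hex-string width, rounded up to whole limbs.
    n = (1 << (len(val_hex) * 4)) - 1
    limbs = (max(n.bit_length(), 1) + bits_in_limb - 1) // bits_in_limb
    mask = (1 << (limbs * bits_in_limb)) - 1
    return (int(val_hex, 16) << count) & mask

# One 32-bit limb: shifting 1 by 31 stays in range, by 32 overflows to 0.
assert core_shift_l("1", 31) == 1 << 31
assert core_shift_l("1", 32) == 0
```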
class BignumCoreCTLookup(BignumCoreTarget, test_data_generation.BaseTest):
"""Test cases for mbedtls_mpi_core_ct_uint_table_lookup()."""
test_function = "mpi_core_ct_uint_table_lookup"
test_name = "Constant time MPI table lookup"
bitsizes = [
(32, "One limb"),
(192, "Smallest curve sized"),
(512, "Largest curve sized"),
(2048, "Small FF/RSA sized"),
(4096, "Large FF/RSA sized"),
]
window_sizes = [0, 1, 2, 3, 4, 5, 6]
def __init__(self,
bitsize: int, descr: str, window_size: int) -> None:
self.bitsize = bitsize
self.bitsize_description = descr
self.window_size = window_size
def arguments(self) -> List[str]:
return [str(self.bitsize), str(self.window_size)]
def description(self) -> str:
return '{} - {} MPI with {} bit window'.format(
BignumCoreCTLookup.test_name,
self.bitsize_description,
self.window_size
)
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
for bitsize, bitsize_description in cls.bitsizes:
for window_size in cls.window_sizes:
yield (cls(bitsize, bitsize_description, window_size)
.create_test_case())
class BignumCoreAddAndAddIf(BignumCoreTarget, bignum_common.OperationCommon):
"""Test cases for bignum core add and add-if."""
count = 0
symbol = "+"
test_function = "mpi_core_add_and_add_if"
test_name = "mpi_core_add_and_add_if"
input_style = "arch_split"
input_values = ADD_SUB_DATA
unique_combinations_only = True
def result(self) -> List[str]:
result = self.int_a + self.int_b
carry, result = divmod(result, self.limb_boundary)
return [
self.format_result(result),
str(carry)
]
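The carry split in `BignumCoreAddAndAddIf.result()` above is a plain `divmod` by the limb boundary: the quotient is the single carry bit, the remainder is the fixed-width sum. A minimal standalone sketch:

```python
def core_add(a, b, limbs, bits_in_limb=32):
    # Reduce the exact sum to the fixed-width result plus a carry.
    boundary = 1 << (limbs * bits_in_limb)
    carry, result = divmod(a + b, boundary)
    return result, carry

# One 32-bit limb: 0xffffffff + 1 wraps to 0 with carry 1.
assert core_add(0xffffffff, 1, limbs=1) == (0, 1)
assert core_add(2, 3, limbs=1) == (5, 0)
```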
class BignumCoreSub(BignumCoreTarget, bignum_common.OperationCommon):
"""Test cases for bignum core sub."""
count = 0
input_style = "arch_split"
symbol = "-"
test_function = "mpi_core_sub"
test_name = "mbedtls_mpi_core_sub"
input_values = ADD_SUB_DATA
def result(self) -> List[str]:
if self.int_a >= self.int_b:
result = self.int_a - self.int_b
carry = 0
else:
result = self.limb_boundary + self.int_a - self.int_b
carry = 1
return [
self.format_result(result),
str(carry)
]
class BignumCoreMLA(BignumCoreTarget, bignum_common.OperationCommon):
"""Test cases for fixed-size multiply accumulate."""
count = 0
test_function = "mpi_core_mla"
test_name = "mbedtls_mpi_core_mla"
input_values = [
"0", "1", "fffe", "ffffffff", "100000000", "20000000000000",
"ffffffffffffffff", "10000000000000000", "1234567890abcdef0",
"fffffffffffffffffefefefefefefefe",
"100000000000000000000000000000000",
"1234567890abcdef01234567890abcdef0",
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
"1234567890abcdef01234567890abcdef01234567890abcdef01234567890abcdef0",
(
"4df72d07b4b71c8dacb6cffa954f8d88254b6277099308baf003fab73227f"
"34029643b5a263f66e0d3c3fa297ef71755efd53b8fb6cb812c6bbf7bcf17"
"9298bd9947c4c8b14324140a2c0f5fad7958a69050a987a6096e9f055fb38"
"edf0c5889eca4a0cfa99b45fbdeee4c696b328ddceae4723945901ec02507"
"6b12b"
)
] # type: List[str]
input_scalars = [
"0", "3", "fe", "ff", "ffff", "10000", "ffffffff", "100000000",
"7f7f7f7f7f7f7f7f", "8000000000000000", "fffffffffffffffe"
] # type: List[str]
def __init__(self, val_a: str, val_b: str, val_s: str) -> None:
super().__init__(val_a, val_b)
self.arg_scalar = val_s
self.int_scalar = bignum_common.hex_to_int(val_s)
if bignum_common.limbs_mpi(self.int_scalar, 32) > 1:
self.dependencies = ["MBEDTLS_HAVE_INT64"]
def arguments(self) -> List[str]:
return [
bignum_common.quote_str(self.arg_a),
bignum_common.quote_str(self.arg_b),
bignum_common.quote_str(self.arg_scalar)
] + self.result()
def description(self) -> str:
"""Override and add the additional scalar."""
if not self.case_description:
self.case_description = "0x{} + 0x{} * 0x{}".format(
self.arg_a, self.arg_b, self.arg_scalar
)
return super().description()
def result(self) -> List[str]:
result = self.int_a + (self.int_b * self.int_scalar)
bound_val = max(self.int_a, self.int_b)
bound_4 = bignum_common.bound_mpi(bound_val, 32)
bound_8 = bignum_common.bound_mpi(bound_val, 64)
carry_4, remainder_4 = divmod(result, bound_4)
carry_8, remainder_8 = divmod(result, bound_8)
return [
"\"{:x}\"".format(remainder_4),
"\"{:x}\"".format(carry_4),
"\"{:x}\"".format(remainder_8),
"\"{:x}\"".format(carry_8)
]
@classmethod
def get_value_pairs(cls) -> Iterator[Tuple[str, str]]:
"""Generator to yield pairs of inputs.
Combinations are first generated from all input values, and then
specific cases provided.
"""
yield from super().get_value_pairs()
yield from cls.input_cases
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
"""Override for additional scalar input."""
for a_value, b_value in cls.get_value_pairs():
for s_value in cls.input_scalars:
cur_op = cls(a_value, b_value, s_value)
yield cur_op.create_test_case()
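`BignumCoreMLA.result()` above splits `a + b * scalar` at both the 32-bit and 64-bit limb bounds of the wider operand, producing a (remainder, carry) pair per width. A standalone sketch of one such split (illustrative helper, not this module's API):

```python
def core_mla(a, b, scalar, bits_in_limb):
    # Accumulate, then split into the in-range remainder and overflow carry
    # at the limb bound of the wider of a and b.
    bound_val = max(a, b)
    limbs = (max(bound_val.bit_length(), 1) + bits_in_limb - 1) // bits_in_limb
    bound = 1 << (limbs * bits_in_limb)
    carry, remainder = divmod(a + b * scalar, bound)
    return remainder, carry

# 0xffffffff + 0xffffffff * 2 = 0x2fffffffd: overflows one 32-bit limb,
# but fits comfortably in one 64-bit limb.
assert core_mla(0xffffffff, 0xffffffff, 2, 32) == (0xfffffffd, 2)
assert core_mla(0xffffffff, 0xffffffff, 2, 64) == (0x2fffffffd, 0)
```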
class BignumCoreMul(BignumCoreTarget, bignum_common.OperationCommon):
"""Test cases for bignum core multiplication."""
count = 0
input_style = "arch_split"
symbol = "*"
test_function = "mpi_core_mul"
test_name = "mbedtls_mpi_core_mul"
arity = 2
unique_combinations_only = True
def format_arg(self, val: str) -> str:
return val
def format_result(self, res: int) -> str:
res_str = '{:x}'.format(res)
a_limbs = bignum_common.limbs_mpi(self.int_a, self.bits_in_limb)
b_limbs = bignum_common.limbs_mpi(self.int_b, self.bits_in_limb)
hex_digits = bignum_common.hex_digits_for_limb(a_limbs + b_limbs, self.bits_in_limb)
return bignum_common.quote_str(self.format_arg(res_str).zfill(hex_digits))
def result(self) -> List[str]:
result = self.int_a * self.int_b
return [self.format_result(result)]
class BignumCoreMontmul(BignumCoreTarget, test_data_generation.BaseTest):
"""Test cases for Montgomery multiplication."""
count = 0
test_function = "mpi_core_montmul"
test_name = "mbedtls_mpi_core_montmul"
start_2_mpi4 = False
start_2_mpi8 = False
replay_test_cases = [
(2, 1, 1, 1, "19", "1", "1D"), (2, 1, 1, 1, "7", "1", "9"),
(2, 1, 1, 1, "4", "1", "9"),
(
12, 1, 6, 1, (
"3C246D0E059A93A266288A7718419EC741661B474C58C032C5EDAF92709402"
"B07CC8C7CE0B781C641A1EA8DB2F4343"
), "1", (
"66A198186C18C10B2F5ED9B522752A9830B69916E535C8F047518A889A43A5"
"94B6BED27A168D31D4A52F88925AA8F5"
)
), (
8, 1, 4, 1,
"1E442976B0E63D64FCCE74B999E470CA9888165CB75BFA1F340E918CE03C6211",
"1", "B3A119602EE213CDE28581ECD892E0F592A338655DCE4CA88054B3D124D0E561"
), (
22, 1, 11, 1, (
"7CF5AC97304E0B63C65413F57249F59994B0FED1D2A8D3D83ED5FA38560FFB"
"82392870D6D08F87D711917FD7537E13B7E125BE407E74157776839B0AC9DB"
"23CBDFC696104353E4D2780B2B4968F8D8542306BCA7A2366E"
), "1", (
"284139EA19C139EBE09A8111926AAA39A2C2BE12ED487A809D3CB5BC558547"
"25B4CDCB5734C58F90B2F60D99CC1950CDBC8D651793E93C9C6F0EAD752500"
"A32C56C62082912B66132B2A6AA42ADA923E1AD22CEB7BA0123"
)
)
] # type: List[Tuple[int, int, int, int, str, str, str]]
random_test_cases = [
("2", "2", "3", ""), ("1", "2", "3", ""), ("2", "1", "3", ""),
("6", "5", "7", ""), ("3", "4", "7", ""), ("1", "6", "7", ""), ("5", "6", "7", ""),
("3", "4", "B", ""), ("7", "4", "B", ""), ("9", "7", "B", ""), ("2", "a", "B", ""),
("25", "16", "29", "(0x29 is prime)"), ("8", "28", "29", ""),
("18", "21", "29", ""), ("15", "f", "29", ""),
("e2", "ea", "FF", ""), ("43", "72", "FF", ""),
("d8", "70", "FF", ""), ("3c", "7c", "FF", ""),
("99", "b9", "101", "(0x101 is prime)"), ("65", "b2", "101", ""),
("81", "32", "101", ""), ("51", "dd", "101", ""),
("d5", "143", "38B", "(0x38B is prime)"), ("3d", "387", "38B", ""),
("160", "2e5", "38B", ""), ("10f", "137", "38B", ""),
("7dac", "25a", "8003", "(0x8003 is prime)"), ("6f1c", "3286", "8003", ""),
("59ed", "2f3f", "8003", ""), ("6893", "736d", "8003", ""),
("d199", "2832", "10001", "(0x10001 is prime)"), ("c3b2", "3e5b", "10001", ""),
("abe4", "214e", "10001", ""), ("4360", "a05d", "10001", ""),
("3f5a1", "165b2", "7F7F7", ""), ("3bd29", "37863", "7F7F7", ""),
("60c47", "64819", "7F7F7", ""), ("16584", "12c49", "7F7F7", ""),
("1ff03f", "610347", "800009", "(0x800009 is prime)"), ("340fd5", "19812e", "800009", ""),
("3fe2e8", "4d0dc7", "800009", ""), ("40356", "e6392", "800009", ""),
("dd8a1d", "266c0e", "100002B", "(0x100002B is prime)"),
("3fa1cb", "847fd6", "100002B", ""), ("5f439d", "5c3196", "100002B", ""),
("18d645", "f72dc6", "100002B", ""),
("20051ad", "37def6e", "37EEE9D", "(0x37EEE9D is prime)"),
("2ec140b", "3580dbf", "37EEE9D", ""), ("1d91b46", "190d4fc", "37EEE9D", ""),
("34e488d", "1224d24", "37EEE9D", ""),
("2a4fe2cb", "263466a9", "8000000B", "(0x8000000B is prime)"),
("5643fe94", "29a1aefa", "8000000B", ""), ("29633513", "7b007ac4", "8000000B", ""),
("2439cef5", "5c9d5a47", "8000000B", ""),
("4de3cfaa", "50dea178", "8CD626B9", "(0x8CD626B9 is prime)"),
("b8b8563", "10dbbbac", "8CD626B9", ""), ("4e8a6151", "5574ec19", "8CD626B9", ""),
("69224878", "309cfc23", "8CD626B9", ""),
("fb6f7fb6", "afb05423", "10000000F", "(0x10000000F is prime)"),
("8391a243", "26034dcd", "10000000F", ""), ("d26b98c", "14b2d6aa", "10000000F", ""),
("6b9f1371", "a21daf1d", "10000000F", ""),
(
"9f49435ad", "c8264ade8", "174876E7E9",
"0x174876E7E9 is prime (dec) 99999999977"
),
("c402da434", "1fb427acf", "174876E7E9", ""),
("f6ebc2bb1", "1096d39f2a", "174876E7E9", ""),
("153b7f7b6b", "878fda8ff", "174876E7E9", ""),
("2c1adbb8d6", "4384d2d3c6", "8000000017", "(0x8000000017 is prime)"),
("2e4f9cf5fb", "794f3443d9", "8000000017", ""),
("149e495582", "3802b8f7b7", "8000000017", ""),
("7b9d49df82", "69c68a442a", "8000000017", ""),
("683a134600", "6dd80ea9f6", "864CB9076D", "(0x864CB9076D is prime)"),
("13a870ff0d", "59b099694a", "864CB9076D", ""),
("37d06b0e63", "4d2147e46f", "864CB9076D", ""),
("661714f8f4", "22e55df507", "864CB9076D", ""),
("2f0a96363", "52693307b4", "F7F7F7F7F7", ""),
("3c85078e64", "f2275ecb6d", "F7F7F7F7F7", ""),
("352dae68d1", "707775b4c6", "F7F7F7F7F7", ""),
("37ae0f3e0b", "912113040f", "F7F7F7F7F7", ""),
("6dada15e31", "f58ed9eff7", "1000000000F", "(0x1000000000F is prime)"),
("69627a7c89", "cfb5ebd13d", "1000000000F", ""),
("a5e1ad239b", "afc030c731", "1000000000F", ""),
("f1cc45f4c5", "c64ad607c8", "1000000000F", ""),
("2ebad87d2e31", "4c72d90bca78", "800000000005", "(0x800000000005 is prime)"),
("a30b3cc50d", "29ac4fe59490", "800000000005", ""),
("33674e9647b4", "5ec7ee7e72d3", "800000000005", ""),
("3d956f474f61", "74070040257d", "800000000005", ""),
("48348e3717d6", "43fcb4399571", "800795D9BA47", "(0x800795D9BA47 is prime)"),
("5234c03cc99b", "2f3cccb87803", "800795D9BA47", ""),
("3ed13db194ab", "44b8f4ba7030", "800795D9BA47", ""),
("1c11e843bfdb", "95bd1b47b08", "800795D9BA47", ""),
("a81d11cb81fd", "1e5753a3f33d", "1000000000015", "(0x1000000000015 is prime)"),
("688c4db99232", "36fc0cf7ed", "1000000000015", ""),
("f0720cc07e07", "fc76140ed903", "1000000000015", ""),
("2ec61f8d17d1", "d270c85e36d2", "1000000000015", ""),
(
"6a24cd3ab63820", "ed4aad55e5e348", "100000000000051",
"(0x100000000000051 is prime)"
),
("e680c160d3b248", "31e0d8840ed510", "100000000000051", ""),
("a80637e9aebc38", "bb81decc4e1738", "100000000000051", ""),
("9afa5a59e9d630", "be9e65a6d42938", "100000000000051", ""),
("ab5e104eeb71c000", "2cffbd639e9fea00", "ABCDEF0123456789", ""),
("197b867547f68a00", "44b796cf94654800", "ABCDEF0123456789", ""),
("329f9483a04f2c00", "9892f76961d0f000", "ABCDEF0123456789", ""),
("4a2e12dfb4545000", "1aa3e89a69794500", "ABCDEF0123456789", ""),
(
"8b9acdf013d140f000", "12e4ceaefabdf2b2f00", "25A55A46E5DA99C71C7",
"0x25A55A46E5DA99C71C7 is the 3rd repunit prime(dec) 11111111111111111111111"
),
("1b8d960ea277e3f5500", "14418aa980e37dd000", "25A55A46E5DA99C71C7", ""),
("7314524977e8075980", "8172fa45618ccd0d80", "25A55A46E5DA99C71C7", ""),
("ca14f031769be63580", "147a2f3cf2964ca9400", "25A55A46E5DA99C71C7", ""),
(
"18532ba119d5cd0cf39735c0000", "25f9838e31634844924733000000",
"314DC643FB763F2B8C0E2DE00879",
"0x314DC643FB763F2B8C0E2DE00879 is (dec)99999999977^3"
),
(
"a56e2d2517519e3970e70c40000", "ec27428d4bb380458588fa80000",
"314DC643FB763F2B8C0E2DE00879", ""
),
(
"1cb5e8257710e8653fff33a00000", "15fdd42fe440fd3a1d121380000",
"314DC643FB763F2B8C0E2DE00879", ""
),
(
"e50d07a65fc6f93e538ce040000", "1f4b059ca609f3ce597f61240000",
"314DC643FB763F2B8C0E2DE00879", ""
),
(
"1ea3ade786a095d978d387f30df9f20000000",
"127c448575f04af5a367a7be06c7da0000000",
"47BF19662275FA2F6845C74942ED1D852E521",
"0x47BF19662275FA2F6845C74942ED1D852E521 is (dec) 99999999977^4"
),
(
"16e15b0ca82764e72e38357b1f10a20000000",
"43e2355d8514bbe22b0838fdc3983a0000000",
"47BF19662275FA2F6845C74942ED1D852E521", ""
),
(
"be39332529d93f25c3d116c004c620000000",
"5cccec42370a0a2c89c6772da801a0000000",
"47BF19662275FA2F6845C74942ED1D852E521", ""
),
(
"ecaa468d90de0eeda474d39b3e1fc0000000",
"1e714554018de6dc0fe576bfd3b5660000000",
"47BF19662275FA2F6845C74942ED1D852E521", ""
),
(
"32298816711c5dce46f9ba06e775c4bedfc770e6700000000000000",
"8ee751fd5fb24f0b4a653cb3a0c8b7d9e724574d168000000000000",
"97EDD86E4B5C4592C6D32064AC55C888A7245F07CA3CC455E07C931",
(
"0x97EDD86E4B5C4592C6D32064AC55C888A7245F07CA3CC455E07C931"
" is (dec) 99999999977^6"
)
),
(
"29213b9df3cfd15f4b428645b67b677c29d1378d810000000000000",
"6cbb732c65e10a28872394dfdd1936d5171c3c3aac0000000000000",
"97EDD86E4B5C4592C6D32064AC55C888A7245F07CA3CC455E07C931", ""
),
(
"6f18db06ad4abc52c0c50643dd13098abccd4a232f0000000000000",
"7e6bf41f2a86098ad51f98dfc10490ba3e8081bc830000000000000",
"97EDD86E4B5C4592C6D32064AC55C888A7245F07CA3CC455E07C931", ""
),
(
"62d3286cd706ad9d73caff63f1722775d7e8c731208000000000000",
"530f7ba02ae2b04c2fe3e3d27ec095925631a6c2528000000000000",
"97EDD86E4B5C4592C6D32064AC55C888A7245F07CA3CC455E07C931", ""
),
(
"a6c6503e3c031fdbf6009a89ed60582b7233c5a85de28b16000000000000000",
"75c8ed18270b583f16d442a467d32bf95c5e491e9b8523798000000000000000",
"DD15FE80B731872AC104DB37832F7E75A244AA2631BC87885B861E8F20375499",
(
"0xDD15FE80B731872AC104DB37832F7E75A244AA2631BC87885B861E8F20375499"
" is (dec) 99999999977^7"
)
),
(
"bf84d1f85cf6b51e04d2c8f4ffd03532d852053cf99b387d4000000000000000",
"397ba5a743c349f4f28bc583ecd5f06e0a25f9c6d98f09134000000000000000",
"DD15FE80B731872AC104DB37832F7E75A244AA2631BC87885B861E8F20375499", ""
),
(
"6db11c3a4152ed1a2aa6fa34b0903ec82ea1b88908dcb482000000000000000",
"ac8ac576a74ad6ca48f201bf89f77350ce86e821358d85920000000000000000",
"DD15FE80B731872AC104DB37832F7E75A244AA2631BC87885B861E8F20375499", ""
),
(
"3001d96d7fe8b733f33687646fc3017e3ac417eb32e0ec708000000000000000",
"925ddbdac4174e8321a48a32f79640e8cf7ec6f46ea235a80000000000000000",
"DD15FE80B731872AC104DB37832F7E75A244AA2631BC87885B861E8F20375499", ""
),
(
"1029048755f2e60dd98c8de6d9989226b6bb4f0db8e46bd1939de560000000000000000000",
"51bb7270b2e25cec0301a03e8275213bb6c2f6e6ec93d4d46d36ca0000000000000000000",
"141B8EBD9009F84C241879A1F680FACCED355DA36C498F73E96E880CF78EA5F96146380E41",
(
"0x141B8EBD9009F84C241879A1F680FACCED355DA36C498F73E96E880CF78EA5F96146"
"380E41 is 99999999977^8"
)
),
(
"1c5337ff982b3ad6611257dbff5bbd7a9920ba2d4f5838a0cc681ce000000000000000000",
"520c5d049ca4702031ba728591b665c4d4ccd3b2b86864d4c160fd2000000000000000000",
"141B8EBD9009F84C241879A1F680FACCED355DA36C498F73E96E880CF78EA5F96146380E41",
""
),
(
"57074dfa00e42f6555bae624b7f0209f218adf57f73ed34ab0ff90c000000000000000000",
"41eb14b6c07bfd3d1fe4f4a610c17cc44fcfcda695db040e011065000000000000000000",
"141B8EBD9009F84C241879A1F680FACCED355DA36C498F73E96E880CF78EA5F96146380E41",
""
),
(
"d8ed7feed2fe855e6997ad6397f776158573d425031bf085a615784000000000000000000",
"6f121dcd18c578ab5e229881006007bb6d319b179f11015fe958b9c000000000000000000",
"141B8EBD9009F84C241879A1F680FACCED355DA36C498F73E96E880CF78EA5F96146380E41",
""
),
(
(
"2a462b156180ea5fe550d3758c764e06fae54e626b5f503265a09df76edbdfbf"
"a1e6000000000000000000000000"
), (
"1136f41d1879fd4fb9e49e0943a46b6704d77c068ee237c3121f9071cfd3e6a0"
"0315800000000000000000000000"
), (
"2A94608DE88B6D5E9F8920F5ABB06B24CC35AE1FBACC87D075C621C3E2833EC90"
"2713E40F51E3B3C214EDFABC451"
), (
"0x2A94608DE88B6D5E9F8920F5ABB06B24CC35AE1FBACC87D075C621C3E2833EC"
"902713E40F51E3B3C214EDFABC451 is (dec) 99999999977^10"
)
),
(
(
"c1ac3800dfb3c6954dea391d206200cf3c47f795bf4a5603b4cb88ae7e574de47"
"40800000000000000000000000"
), (
"c0d16eda0549ede42fa0deb4635f7b7ce061fadea02ee4d85cba4c4f709603419"
"3c800000000000000000000000"
), (
"2A94608DE88B6D5E9F8920F5ABB06B24CC35AE1FBACC87D075C621C3E2833EC90"
"2713E40F51E3B3C214EDFABC451"
), ""
),
(
(
"19e45bb7633094d272588ad2e43bcb3ee341991c6731b6fa9d47c4018d7ce7bba"
"5ee800000000000000000000000"
), (
"1e4f83166ae59f6b9cc8fd3e7677ed8bfc01bb99c98bd3eb084246b64c1e18c33"
"65b800000000000000000000000"
), (
"2A94608DE88B6D5E9F8920F5ABB06B24CC35AE1FBACC87D075C621C3E2833EC90"
"2713E40F51E3B3C214EDFABC451"
), ""
),
(
(
"1aa93395fad5f9b7f20b8f9028a054c0bb7c11bb8520e6a95e5a34f06cb70bcdd"
"01a800000000000000000000000"
), (
"54b45afa5d4310192f8d224634242dd7dcfb342318df3d9bd37b4c614788ba13b"
"8b000000000000000000000000"
), (
"2A94608DE88B6D5E9F8920F5ABB06B24CC35AE1FBACC87D075C621C3E2833EC90"
"2713E40F51E3B3C214EDFABC451"
), ""
),
(
(
"544f2628a28cfb5ce0a1b7180ee66b49716f1d9476c466c57f0c4b23089917843"
"06d48f78686115ee19e25400000000000000000000000000000000"
), (
"677eb31ef8d66c120fa872a60cd47f6e10cbfdf94f90501bd7883cba03d185be0"
"a0148d1625745e9c4c827300000000000000000000000000000000"
), (
"8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBFA1"
"1DABD6E6144BEF37C6800000000000000000000000000000000051"
), (
"0x8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBF"
"A11DABD6E6144BEF37C6800000000000000000000000000000000051 is prime,"
" (dec) 10^143 + 3^4"
)
),
(
(
"76bb3470985174915e9993522aec989666908f9e8cf5cb9f037bf4aee33d8865c"
"b6464174795d07e30015b80000000000000000000000000000000"
), (
"6aaaf60d5784dcef612d133613b179a317532ecca0eed40b8ad0c01e6d4a6d8c7"
"9a52af190abd51739009a900000000000000000000000000000000"
), (
"8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBFA1"
"1DABD6E6144BEF37C6800000000000000000000000000000000051"
), ""
),
(
(
"6cfdd6e60912e441d2d1fc88f421b533f0103a5322ccd3f4db84861643ad63fd6"
"3d1d8cfbc1d498162786ba00000000000000000000000000000000"
), (
"1177246ec5e93814816465e7f8f248b350d954439d35b2b5d75d917218e7fd5fb"
"4c2f6d0667f9467fdcf33400000000000000000000000000000000"
), (
"8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBFA1"
"1DABD6E6144BEF37C6800000000000000000000000000000000051"
), ""
),
(
(
"7a09a0b0f8bbf8057116fb0277a9bdf3a91b5eaa8830d448081510d8973888be5"
"a9f0ad04facb69aa3715f00000000000000000000000000000000"
), (
"764dec6c05a1c0d87b649efa5fd94c91ea28bffb4725d4ab4b33f1a3e8e3b314d"
"799020e244a835a145ec9800000000000000000000000000000000"
), (
"8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBFA1"
"1DABD6E6144BEF37C6800000000000000000000000000000000051"
), ""
)
] # type: List[Tuple[str, str, str, str]]
def __init__(
self, val_a: str, val_b: str, val_n: str, case_description: str = ""
):
self.case_description = case_description
self.arg_a = val_a
self.int_a = bignum_common.hex_to_int(val_a)
self.arg_b = val_b
self.int_b = bignum_common.hex_to_int(val_b)
self.arg_n = val_n
self.int_n = bignum_common.hex_to_int(val_n)
limbs_a4 = bignum_common.limbs_mpi(self.int_a, 32)
limbs_a8 = bignum_common.limbs_mpi(self.int_a, 64)
self.limbs_b4 = bignum_common.limbs_mpi(self.int_b, 32)
self.limbs_b8 = bignum_common.limbs_mpi(self.int_b, 64)
self.limbs_an4 = bignum_common.limbs_mpi(self.int_n, 32)
self.limbs_an8 = bignum_common.limbs_mpi(self.int_n, 64)
if limbs_a4 > self.limbs_an4 or limbs_a8 > self.limbs_an8:
raise Exception("Limbs of input A ({}) exceed N ({})".format(
self.arg_a, self.arg_n
))
def arguments(self) -> List[str]:
return [
str(self.limbs_an4), str(self.limbs_b4),
str(self.limbs_an8), str(self.limbs_b8),
bignum_common.quote_str(self.arg_a),
bignum_common.quote_str(self.arg_b),
bignum_common.quote_str(self.arg_n)
] + self.result()
def description(self) -> str:
if self.case_description != "replay":
if not self.start_2_mpi4 and self.limbs_an4 > 1:
tmp = "(start of 2-MPI 4-byte bignums) "
self.__class__.start_2_mpi4 = True
elif not self.start_2_mpi8 and self.limbs_an8 > 1:
tmp = "(start of 2-MPI 8-byte bignums) "
self.__class__.start_2_mpi8 = True
else:
tmp = "(gen) "
self.case_description = tmp + self.case_description
return super().description()
def result(self) -> List[str]:
"""Get the result of the operation."""
r4 = bignum_common.bound_mpi_limbs(self.limbs_an4, 32)
i4 = bignum_common.invmod(r4, self.int_n)
x4 = self.int_a * self.int_b * i4
x4 = x4 % self.int_n
r8 = bignum_common.bound_mpi_limbs(self.limbs_an8, 64)
i8 = bignum_common.invmod(r8, self.int_n)
x8 = self.int_a * self.int_b * i8
x8 = x8 % self.int_n
return [
"\"{:x}\"".format(x4),
"\"{:x}\"".format(x8)
]
def set_limbs(
self, limbs_an4: int, limbs_b4: int, limbs_an8: int, limbs_b8: int
) -> None:
"""Set number of limbs for each input.
Replaces default values set during initialization.
"""
self.limbs_an4 = limbs_an4
self.limbs_b4 = limbs_b4
self.limbs_an8 = limbs_an8
self.limbs_b8 = limbs_b8
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
"""Generate replay and randomly generated test cases."""
# Test cases which replay captured invocations during unit test runs.
for limbs_an4, limbs_b4, limbs_an8, limbs_b8, a, b, n in cls.replay_test_cases:
cur_op = cls(a, b, n, case_description="replay")
cur_op.set_limbs(limbs_an4, limbs_b4, limbs_an8, limbs_b8)
yield cur_op.create_test_case()
# Random test cases can be generated using mpi_modmul_case_generate()
# Uses a mixture of primes and odd numbers as N, with four randomly
# generated cases for each N.
for a, b, n, description in cls.random_test_cases:
cur_op = cls(a, b, n, case_description=description)
yield cur_op.create_test_case()
def mpi_modmul_case_generate() -> None:
"""Generate valid inputs for montmul tests using moduli.
For each modulus, generates random values for A and B and simple descriptions
for the test case.
"""
moduli = [
("3", ""), ("7", ""), ("B", ""), ("29", ""), ("FF", ""),
("101", ""), ("38B", ""), ("8003", ""), ("10001", ""),
("7F7F7", ""), ("800009", ""), ("100002B", ""), ("37EEE9D", ""),
("8000000B", ""), ("8CD626B9", ""), ("10000000F", ""),
("174876E7E9", "is prime (dec) 99999999977"),
("8000000017", ""), ("864CB9076D", ""), ("F7F7F7F7F7", ""),
("1000000000F", ""), ("800000000005", ""), ("800795D9BA47", ""),
("1000000000015", ""), ("100000000000051", ""), ("ABCDEF0123456789", ""),
(
"25A55A46E5DA99C71C7",
"is the 3rd repunit prime (dec) 11111111111111111111111"
),
("314DC643FB763F2B8C0E2DE00879", "is (dec) 99999999977^3"),
("47BF19662275FA2F6845C74942ED1D852E521", "is (dec) 99999999977^4"),
(
"97EDD86E4B5C4592C6D32064AC55C888A7245F07CA3CC455E07C931",
"is (dec) 99999999977^6"
),
(
"DD15FE80B731872AC104DB37832F7E75A244AA2631BC87885B861E8F20375499",
"is (dec) 99999999977^7"
),
(
"141B8EBD9009F84C241879A1F680FACCED355DA36C498F73E96E880CF78EA5F96146380E41",
"is (dec) 99999999977^8"
),
(
(
"2A94608DE88B6D5E9F8920F5ABB06B24CC35AE1FBACC87D075C621C3E283"
"3EC902713E40F51E3B3C214EDFABC451"
),
"is (dec) 99999999977^10"
),
(
"8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBFA11"
"DABD6E6144BEF37C6800000000000000000000000000000000051",
"is prime, (dec) 10^143 + 3^4"
)
] # type: List[Tuple[str, str]]
primes = [
"3", "7", "B", "29", "101", "38B", "8003", "10001", "800009",
"100002B", "37EEE9D", "8000000B", "8CD626B9",
# From here they require > 1 4-byte MPI
"10000000F", "174876E7E9", "8000000017", "864CB9076D", "1000000000F",
"800000000005", "800795D9BA47", "1000000000015", "100000000000051",
# From here they require > 1 8-byte MPI
"25A55A46E5DA99C71C7", # this is 11111111111111111111111 decimal
# 10^143 + 3^4: (which is prime)
# 100000000000000000000000000000000000000000000000000000000000000000000000000000
# 000000000000000000000000000000000000000000000000000000000000000081
(
"8335616AED761F1F7F44E6BD49E807B82E3BF2BF11BFA6AF813C808DBF33DBFA11"
"DABD6E6144BEF37C6800000000000000000000000000000000051"
)
] # type: List[str]
generated_inputs = []
for mod, description in moduli:
n = bignum_common.hex_to_int(mod)
mod_read = "{:x}".format(n)
case_count = 3 if n < 5 else 4
cases = {} # type: Dict[int, int]
i = 0
while i < case_count:
a = random.randint(1, n)
b = random.randint(1, n)
if cases.get(a) == b:
continue
cases[a] = b
if description:
out_description = "0x{} {}".format(mod_read, description)
elif i == 0 and len(mod) > 1 and mod in primes:
out_description = "(0x{} is prime)".format(mod_read)
else:
out_description = ""
generated_inputs.append(
("{:x}".format(a), "{:x}".format(b), mod, out_description)
)
i += 1
print(generated_inputs)
class BignumCoreExpMod(BignumCoreTarget, bignum_common.ModOperationCommon):
"""Test cases for bignum core exponentiation."""
symbol = "^"
test_function = "mpi_core_exp_mod"
test_name = "Core modular exponentiation (Montgomery form only)"
input_style = "fixed"
montgomery_form_a = True
def result(self) -> List[str]:
# Result has to be given in Montgomery form too
result = pow(self.int_a, self.int_b, self.int_n)
mont_result = self.to_montgomery(result)
return [self.format_result(mont_result)]
@property
def is_valid(self) -> bool:
# The base needs to be canonical, but the exponent can be larger than
# the modulus (see for example exponent blinding)
return bool(self.int_a < self.int_n)
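The expected value for `mpi_core_exp_mod` is computed on canonical integers and then mapped into the Montgomery domain. The framework's `to_montgomery()` is not shown in this diff; the sketch below assumes the conventional definition with R = 2**(limbs * limb_bits) for 64-bit limbs.

```python
def limbs(n: int, limb_bits: int) -> int:
    """Number of limbs needed to store n."""
    return (n.bit_length() + limb_bits - 1) // limb_bits

def to_montgomery(x: int, n: int, limb_bits: int = 64) -> int:
    """Map x into the Montgomery domain: x * R mod n, R = 2**(limbs * limb_bits)."""
    r = 2 ** (limbs(n, limb_bits) * limb_bits)
    return (x * r) % n

def core_exp_mod_expected(a: int, b: int, n: int, limb_bits: int = 64) -> int:
    # Exponentiate on canonical integers, then convert the result,
    # since the test data expects the result in Montgomery form too.
    return to_montgomery(pow(a, b, n), n, limb_bits)
```

Multiplying the Montgomery-form result by R^-1 mod n recovers the canonical `pow(a, b, n)`.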
class BignumCoreSubInt(BignumCoreTarget, bignum_common.OperationCommon):
"""Test cases for bignum core sub int."""
count = 0
symbol = "-"
test_function = "mpi_core_sub_int"
test_name = "mpi_core_sub_int"
input_style = "arch_split"
@property
def is_valid(self) -> bool:
# This is "sub int", so b is only one limb
if bignum_common.limbs_mpi(self.int_b, self.bits_in_limb) > 1:
return False
return True
# Overriding because we don't want leading zeros on b
@property
def arg_b(self) -> str:
return self.val_b
def result(self) -> List[str]:
result = self.int_a - self.int_b
borrow, result = divmod(result, self.limb_boundary)
# Borrow will be -1 if non-zero, but we want it to be 1 in the test data
return [
self.format_result(result),
str(-borrow)
]
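The `divmod()` trick above yields both the wrapped result and the borrow flag in one step. A standalone sketch, assuming a single 64-bit limb for the minuend (in the class the boundary depends on the number of limbs of `a`):

```python
LIMB_BOUNDARY = 2 ** 64  # illustrative single-limb boundary

def sub_int_expected(a: int, b: int) -> tuple:
    """Expected (result, borrow) pair for mpi_core_sub_int-style test data."""
    borrow, result = divmod(a - b, LIMB_BOUNDARY)
    # divmod() floors, so borrow is -1 exactly when a < b; negating it
    # gives the 1/0 flag the test data expects.
    return result, -borrow
```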
class BignumCoreZeroCheckCT(BignumCoreTarget, bignum_common.OperationCommon):
"""Test cases for bignum core zero check (constant flow)."""
count = 0
symbol = "== 0"
test_function = "mpi_core_check_zero_ct"
test_name = "mpi_core_check_zero_ct"
input_style = "variable"
arity = 1
suffix = True
def result(self) -> List[str]:
result = 1 if self.int_a == 0 else 0
return [str(result)]

@@ -0,0 +1,159 @@
"""Base values and datasets for bignum generated tests and helper functions that
produced them."""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import random
# Functions calling these were used to produce test data and are here only for
# reproducibility, they are not used by the test generation framework/classes
try:
from Cryptodome.Util.number import isPrime, getPrime #type: ignore #pylint: disable=import-error
except ImportError:
pass
# Generated by bignum_common.gen_safe_prime(192,1)
SAFE_PRIME_192_BIT_SEED_1 = "d1c127a667786703830500038ebaef20e5a3e2dc378fb75b"
# First number generated by random.getrandbits(192) - seed(2,2), not a prime
RANDOM_192_BIT_SEED_2_NO1 = "177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"
# Second number generated by random.getrandbits(192) - seed(2,2), not a prime
RANDOM_192_BIT_SEED_2_NO2 = "cf1822ffbc6887782b491044d5e341245c6e433715ba2bdd"
# Third number generated by random.getrandbits(192) - seed(2,2), not a prime
RANDOM_192_BIT_SEED_2_NO3 = "3653f8dd9b1f282e4067c3584ee207f8da94e3e8ab73738f"
# Fourth number generated by random.getrandbits(192) - seed(2,2), not a prime
RANDOM_192_BIT_SEED_2_NO4 = "ffed9235288bc781ae66267594c9c9500925e4749b575bd1"
# Ninth number generated by random.getrandbits(192) - seed(2,2), not a prime
RANDOM_192_BIT_SEED_2_NO9 = "2a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"
# Generated by bignum_common.gen_safe_prime(1024,3)
SAFE_PRIME_1024_BIT_SEED_3 = ("c93ba7ec74d96f411ba008bdb78e63ff11bb5df46a51e16b"
"2c9d156f8e4e18abf5e052cb01f47d0d1925a77f60991577"
"e128fb6f52f34a27950a594baadd3d8057abeb222cf3cca9"
"62db16abf79f2ada5bd29ab2f51244bf295eff9f6aaba130"
"2efc449b128be75eeaca04bc3c1a155d11d14e8be32a2c82"
"87b3996cf6ad5223")
# First number generated by random.getrandbits(1024) - seed(4,2), not a prime
RANDOM_1024_BIT_SEED_4_NO1 = ("6905269ed6f0b09f165c8ce36e2f24b43000de01b2ed40ed"
"3addccb2c33be0ac79d679346d4ac7a5c3902b38963dc6e8"
"534f45738d048ec0f1099c6c3e1b258fd724452ccea71ff4"
"a14876aeaff1a098ca5996666ceab360512bd13110722311"
"710cf5327ac435a7a97c643656412a9b8a1abcd1a6916c74"
"da4f9fc3c6da5d7")
# Second number generated by random.getrandbits(1024) - seed(4,2), not a prime
RANDOM_1024_BIT_SEED_4_NO2 = ("f1cfd99216df648647adec26793d0e453f5082492d83a823"
"3fb62d2c81862fc9634f806fabf4a07c566002249b191bf4"
"d8441b5616332aca5f552773e14b0190d93936e1daca3c06"
"f5ff0c03bb5d7385de08caa1a08179104a25e4664f5253a0"
"2a3187853184ff27459142deccea264542a00403ce80c4b0"
"a4042bb3d4341aad")
# Third number generated by random.getrandbits(1024) - seed(4,2), not a prime
RANDOM_1024_BIT_SEED_4_NO3 = ("14c15c910b11ad28cc21ce88d0060cc54278c2614e1bcb38"
"3bb4a570294c4ea3738d243a6e58d5ca49c7b59b995253fd"
"6c79a3de69f85e3131f3b9238224b122c3e4a892d9196ada"
"4fcfa583e1df8af9b474c7e89286a1754abcb06ae8abb93f"
"01d89a024cdce7a6d7288ff68c320f89f1347e0cdd905ecf"
"d160c5d0ef412ed6")
# Fourth number generated by random.getrandbits(1024) - seed(4,2), not a prime
RANDOM_1024_BIT_SEED_4_NO4 = ("32decd6b8efbc170a26a25c852175b7a96b98b5fbf37a2be"
"6f98bca35b17b9662f0733c846bbe9e870ef55b1a1f65507"
"a2909cb633e238b4e9dd38b869ace91311021c9e32111ac1"
"ac7cc4a4ff4dab102522d53857c49391b36cc9aa78a330a1"
"a5e333cb88dcf94384d4cd1f47ca7883ff5a52f1a05885ac"
"7671863c0bdbc23a")
# Fifth number generated by random.getrandbits(1024) - seed(4,2), not a prime
RANDOM_1024_BIT_SEED_4_NO5 = ("53be4721f5b9e1f5acdac615bc20f6264922b9ccf469aef8"
"f6e7d078e55b85dd1525f363b281b8885b69dc230af5ac87"
"0692b534758240df4a7a03052d733dcdef40af2e54c0ce68"
"1f44ebd13cc75f3edcb285f89d8cf4d4950b16ffc3e1ac3b"
"4708d9893a973000b54a23020fc5b043d6e4a51519d9c9cc"
"52d32377e78131c1")
# Adding 192 bit and 1024 bit numbers because these are the shortest required
# for ECC and RSA respectively.
INPUTS_DEFAULT = [
"0", "1", # corner cases
"2", "3", # small primes
"4", # non-prime even
"38", # small random
SAFE_PRIME_192_BIT_SEED_1, # prime
RANDOM_192_BIT_SEED_2_NO1, # not a prime
RANDOM_192_BIT_SEED_2_NO2, # not a prime
SAFE_PRIME_1024_BIT_SEED_3, # prime
RANDOM_1024_BIT_SEED_4_NO1, # not a prime
RANDOM_1024_BIT_SEED_4_NO3, # not a prime
RANDOM_1024_BIT_SEED_4_NO2, # largest (not a prime)
]
ADD_SUB_DATA = [
"0", "1", "3", "f", "fe", "ff", "100", "ff00",
"fffe", "ffff", "10000", # 2^16 - 1, 2^16, 2^16 + 1
"fffffffe", "ffffffff", "100000000", # 2^32 - 1, 2^32, 2^32 + 1
"1f7f7f7f7f7f7f",
"8000000000000000", "fefefefefefefefe",
"fffffffffffffffe", "ffffffffffffffff", "10000000000000000", # 2^64 - 1, 2^64, 2^64 + 1
"1234567890abcdef0",
"fffffffffffffffffffffffe",
"ffffffffffffffffffffffff",
"1000000000000000000000000",
"fffffffffffffffffefefefefefefefe",
"fffffffffffffffffffffffffffffffe",
"ffffffffffffffffffffffffffffffff",
"100000000000000000000000000000000",
"1234567890abcdef01234567890abcdef0",
"fffffffffffffffffffffffffffffffffffffffffffffffffefefefefefefefe",
"fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffe",
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
"10000000000000000000000000000000000000000000000000000000000000000",
"1234567890abcdef01234567890abcdef01234567890abcdef01234567890abcdef0",
]
# Only odd moduli are present as in the new bignum code only odd moduli are
# supported for now.
MODULI_DEFAULT = [
"53", # safe prime
"45", # non-prime
SAFE_PRIME_192_BIT_SEED_1, # safe prime
RANDOM_192_BIT_SEED_2_NO4, # not a prime
SAFE_PRIME_1024_BIT_SEED_3, # safe prime
RANDOM_1024_BIT_SEED_4_NO5, # not a prime
]
# Some functions, e.g. mbedtls_mpi_mod_raw_inv_prime(), only support prime moduli.
ONLY_PRIME_MODULI = [
"53", # safe prime
"8ac72304057392b5", # 9999999997777777333 (longer, not safe, prime)
# The next prime has a different R in Montgomery form depending on
# whether 32- or 64-bit MPIs are used.
"152d02c7e14af67fe0bf", # 99999999999999999991999
SAFE_PRIME_192_BIT_SEED_1, # safe prime
SAFE_PRIME_1024_BIT_SEED_3, # safe prime
]
def __gen_safe_prime(bits, seed):
'''
Generate a safe prime.
This function is intended for generating constants offline and shouldn't be
used in test generation classes.
Requires pycryptodomex (for getPrime and isPrime) and Python 3.9 or later
(for randbytes).
'''
rng = random.Random()
# We want reproducibility across python versions
rng.seed(seed, version=2)
while True:
prime = 2*getPrime(bits-1, rng.randbytes)+1 #pylint: disable=no-member
if isPrime(prime, 1e-30):
return prime
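A safe prime is a prime p for which (p - 1) / 2 is also prime. The helper above relies on pycryptodomex; the defining property can be checked self-containedly with naive trial division (fine for small values, illustration only):

```python
def is_prime(n: int) -> bool:
    """Naive trial-division primality test (small inputs only)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def is_safe_prime(p: int) -> bool:
    """p is a safe prime iff both p and (p - 1) / 2 are prime."""
    return is_prime(p) and is_prime((p - 1) // 2)
```

For example, the small modulus "53" in MODULI_DEFAULT is hex for 83, and (83 - 1) / 2 = 41 is prime, matching its "safe prime" comment.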

@@ -0,0 +1,102 @@
"""Framework classes for generation of bignum mod test cases."""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
from typing import Dict, List
from . import test_data_generation
from . import bignum_common
from .bignum_data import ONLY_PRIME_MODULI
class BignumModTarget(test_data_generation.BaseTarget):
#pylint: disable=abstract-method, too-few-public-methods
"""Target for bignum mod test case generation."""
target_basename = 'test_suite_bignum_mod.generated'
class BignumModMul(bignum_common.ModOperationCommon,
BignumModTarget):
# pylint:disable=duplicate-code
"""Test cases for bignum mpi_mod_mul()."""
symbol = "*"
test_function = "mpi_mod_mul"
test_name = "mbedtls_mpi_mod_mul"
input_style = "arch_split"
arity = 2
def arguments(self) -> List[str]:
return [self.format_result(self.to_montgomery(self.int_a)),
self.format_result(self.to_montgomery(self.int_b)),
bignum_common.quote_str(self.arg_n)
] + self.result()
def result(self) -> List[str]:
result = (self.int_a * self.int_b) % self.int_n
return [self.format_result(self.to_montgomery(result))]
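The test data for `mpi_mod_mul` keeps operands and result in Montgomery form. The generated expected values rely on the identity REDC(mont(a) * mont(b)) == mont(a * b mod n), where REDC(x) = x * R^-1 mod n. A sketch of that check, assuming R = 2**64 for a single-limb odd modulus:

```python
def mont_mul_identity(a: int, b: int, n: int, r: int = 2 ** 64) -> bool:
    """Check REDC(mont(a) * mont(b)) == mont(a * b mod n) for odd n < r."""
    def to_mont(x: int) -> int:
        return (x * r) % n
    r_inv = pow(r, -1, n)  # R must be invertible mod n, hence n odd
    return (to_mont(a) * to_mont(b) * r_inv) % n == to_mont((a * b) % n)
```

The identity holds because mont(a) * mont(b) * R^-1 = a * b * R mod n, which is exactly mont(a * b mod n).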
class BignumModSub(bignum_common.ModOperationCommon, BignumModTarget):
"""Test cases for bignum mpi_mod_sub()."""
symbol = "-"
test_function = "mpi_mod_sub"
test_name = "mbedtls_mpi_mod_sub"
input_style = "fixed"
arity = 2
def result(self) -> List[str]:
result = (self.int_a - self.int_b) % self.int_n
# To make negative tests easier, append 0 for success to the
# generated cases
return [self.format_result(result), "0"]
class BignumModInvNonMont(bignum_common.ModOperationCommon, BignumModTarget):
"""Test cases for bignum mpi_mod_inv() - not in Montgomery form."""
moduli = ONLY_PRIME_MODULI # for now only prime moduli supported
symbol = "^ -1"
test_function = "mpi_mod_inv_non_mont"
test_name = "mbedtls_mpi_mod_inv non-Mont. form"
input_style = "fixed"
arity = 1
suffix = True
disallow_zero_a = True
def result(self) -> List[str]:
result = bignum_common.invmod_positive(self.int_a, self.int_n)
# To make negative tests easier, append 0 for success to the
# generated cases
return [self.format_result(result), "0"]
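`bignum_common.invmod_positive()` is assumed to return the modular inverse in the range [0, n). Since Python 3.8, the built-in `pow()` computes the same value directly, which makes the expected results easy to reproduce:

```python
def invmod_positive(a: int, n: int) -> int:
    """Modular inverse of a mod n, in [0, n)."""
    # pow() raises ValueError when gcd(a, n) != 1, i.e. no inverse exists.
    return pow(a, -1, n)
```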
class BignumModInvMont(bignum_common.ModOperationCommon, BignumModTarget):
"""Test cases for bignum mpi_mod_inv() - Montgomery form."""
moduli = ONLY_PRIME_MODULI # for now only prime moduli supported
symbol = "^ -1"
test_function = "mpi_mod_inv_mont"
test_name = "mbedtls_mpi_mod_inv Mont. form"
input_style = "arch_split" # Mont. form requires arch_split
arity = 1
suffix = True
disallow_zero_a = True
montgomery_form_a = True
def result(self) -> List[str]:
result = bignum_common.invmod_positive(self.int_a, self.int_n)
mont_result = self.to_montgomery(result)
# To make negative tests easier, append 0 for success to the
# generated cases
return [self.format_result(mont_result), "0"]
class BignumModAdd(bignum_common.ModOperationCommon, BignumModTarget):
"""Test cases for bignum mpi_mod_add()."""
count = 0
symbol = "+"
test_function = "mpi_mod_add"
test_name = "mbedtls_mpi_mod_add"
input_style = "fixed"
def result(self) -> List[str]:
result = (self.int_a + self.int_b) % self.int_n
# To make negative tests easier, append "0" for success to the
# generated cases
return [self.format_result(result), "0"]

@@ -0,0 +1,242 @@
"""Framework classes for generation of bignum mod_raw test cases."""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
from typing import Iterator, List
from . import test_case
from . import test_data_generation
from . import bignum_common
from .bignum_data import ONLY_PRIME_MODULI
class BignumModRawTarget(test_data_generation.BaseTarget):
#pylint: disable=abstract-method, too-few-public-methods
"""Target for bignum mod_raw test case generation."""
target_basename = 'test_suite_bignum_mod_raw.generated'
class BignumModRawSub(bignum_common.ModOperationCommon,
BignumModRawTarget):
"""Test cases for bignum mpi_mod_raw_sub()."""
symbol = "-"
test_function = "mpi_mod_raw_sub"
test_name = "mbedtls_mpi_mod_raw_sub"
input_style = "fixed"
arity = 2
def arguments(self) -> List[str]:
return [bignum_common.quote_str(n) for n in [self.arg_a,
self.arg_b,
self.arg_n]
] + self.result()
def result(self) -> List[str]:
result = (self.int_a - self.int_b) % self.int_n
return [self.format_result(result)]
class BignumModRawFixQuasiReduction(bignum_common.ModOperationCommon,
BignumModRawTarget):
"""Test cases for ecp quasi_reduction()."""
symbol = "-"
test_function = "mpi_mod_raw_fix_quasi_reduction"
test_name = "fix_quasi_reduction"
input_style = "fixed"
arity = 1
# Extend the default values with n < x < 2n
input_values = bignum_common.ModOperationCommon.input_values + [
"73",
# First number generated by random.getrandbits(1024) - seed(3,2)
"ea7b5bf55eb561a4216363698b529b4a97b750923ceb3ffd",
# First number generated by random.getrandbits(1024) - seed(1,2)
("cd447e35b8b6d8fe442e3d437204e52db2221a58008a05a6c4647159c324c985"
"9b810e766ec9d28663ca828dd5f4b3b2e4b06ce60741c7a87ce42c8218072e8c"
"35bf992dc9e9c616612e7696a6cecc1b78e510617311d8a3c2ce6f447ed4d57b"
"1e2feb89414c343c1027c4d1c386bbc4cd613e30d8f16adf91b7584a2265b1f5")
] # type: List[str]
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return bool(self.int_a < 2 * self.int_n)
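Fixing a quasi-reduced value means bringing it from the range [0, 2n) into the canonical range [0, n) with a single conditional subtraction, which is why `is_valid` above only accepts inputs below 2 * n. A minimal sketch:

```python
def fix_quasi_reduction(x: int, n: int) -> int:
    """Reduce x from [0, 2n) to [0, n) with one conditional subtraction."""
    assert 0 <= x < 2 * n
    return x - n if x >= n else x
```

On the accepted input range this agrees with a full `x % n` reduction.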
class BignumModRawMul(bignum_common.ModOperationCommon,
BignumModRawTarget):
"""Test cases for bignum mpi_mod_raw_mul()."""
symbol = "*"
test_function = "mpi_mod_raw_mul"
test_name = "mbedtls_mpi_mod_raw_mul"
input_style = "arch_split"
arity = 2
def arguments(self) -> List[str]:
return [self.format_result(self.to_montgomery(self.int_a)),
self.format_result(self.to_montgomery(self.int_b)),
bignum_common.quote_str(self.arg_n)
] + self.result()
def result(self) -> List[str]:
result = (self.int_a * self.int_b) % self.int_n
return [self.format_result(self.to_montgomery(result))]
class BignumModRawInvPrime(bignum_common.ModOperationCommon,
BignumModRawTarget):
"""Test cases for bignum mpi_mod_raw_inv_prime()."""
moduli = ONLY_PRIME_MODULI
symbol = "^ -1"
test_function = "mpi_mod_raw_inv_prime"
test_name = "mbedtls_mpi_mod_raw_inv_prime (Montgomery form only)"
input_style = "arch_split"
arity = 1
suffix = True
montgomery_form_a = True
disallow_zero_a = True
def result(self) -> List[str]:
result = bignum_common.invmod_positive(self.int_a, self.int_n)
mont_result = self.to_montgomery(result)
return [self.format_result(mont_result)]
class BignumModRawAdd(bignum_common.ModOperationCommon,
BignumModRawTarget):
"""Test cases for bignum mpi_mod_raw_add()."""
symbol = "+"
test_function = "mpi_mod_raw_add"
test_name = "mbedtls_mpi_mod_raw_add"
input_style = "fixed"
arity = 2
def result(self) -> List[str]:
result = (self.int_a + self.int_b) % self.int_n
return [self.format_result(result)]
class BignumModRawConvertRep(bignum_common.ModOperationCommon,
BignumModRawTarget):
# This is an abstract class, it's ok to have unimplemented methods.
#pylint: disable=abstract-method
"""Test cases for representation conversion."""
symbol = ""
input_style = "arch_split"
arity = 1
rep = bignum_common.ModulusRepresentation.INVALID
def set_representation(self, r: bignum_common.ModulusRepresentation) -> None:
self.rep = r
def arguments(self) -> List[str]:
return ([bignum_common.quote_str(self.arg_n), self.rep.symbol(),
bignum_common.quote_str(self.arg_a)] +
self.result())
def description(self) -> str:
base = super().description()
mod_with_rep = 'mod({})'.format(self.rep.name)
return base.replace('mod', mod_with_rep, 1)
@classmethod
def test_cases_for_values(cls, rep: bignum_common.ModulusRepresentation,
n: str, a: str) -> Iterator[test_case.TestCase]:
"""Emit test cases for the given values (if any).
This may emit no test cases if a isn't valid for the modulus n,
or multiple test cases if rep requires different data depending
on the limb size.
"""
for bil in cls.limb_sizes:
test_object = cls(n, a, bits_in_limb=bil)
test_object.set_representation(rep)
# The class is set to having separate test cases for each limb
# size, because the Montgomery representation requires it.
# But other representations don't require it. So for other
# representations, emit a single test case with no dependency
# on the limb size.
if rep is not bignum_common.ModulusRepresentation.MONTGOMERY:
test_object.dependencies = \
[dep for dep in test_object.dependencies
if not dep.startswith('MBEDTLS_HAVE_INT')]
if test_object.is_valid:
yield test_object.create_test_case()
if rep is not bignum_common.ModulusRepresentation.MONTGOMERY:
# A single test case (emitted, or skipped due to invalidity)
# is enough, since this test case doesn't depend on the
# limb size.
break
# The parent class doesn't support non-bignum parameters. So we override
# test generation, in order to have the representation as a parameter.
@classmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
for rep in bignum_common.ModulusRepresentation.supported_representations():
for n in cls.moduli:
for a in cls.input_values:
yield from cls.test_cases_for_values(rep, n, a)
class BignumModRawCanonicalToModulusRep(BignumModRawConvertRep):
"""Test cases for mpi_mod_raw_canonical_to_modulus_rep."""
test_function = "mpi_mod_raw_canonical_to_modulus_rep"
test_name = "Rep canon->mod"
def result(self) -> List[str]:
return [self.format_result(self.convert_from_canonical(self.int_a, self.rep))]
class BignumModRawModulusToCanonicalRep(BignumModRawConvertRep):
"""Test cases for mpi_mod_raw_modulus_to_canonical_rep."""
test_function = "mpi_mod_raw_modulus_to_canonical_rep"
test_name = "Rep mod->canon"
@property
def arg_a(self) -> str:
return self.format_arg("{:x}".format(self.convert_from_canonical(self.int_a, self.rep)))
def result(self) -> List[str]:
return [self.format_result(self.int_a)]
class BignumModRawConvertToMont(bignum_common.ModOperationCommon,
BignumModRawTarget):
""" Test cases for mpi_mod_raw_to_mont_rep(). """
test_function = "mpi_mod_raw_to_mont_rep"
test_name = "Convert into Mont: "
symbol = "R *"
input_style = "arch_split"
arity = 1
def result(self) -> List[str]:
result = self.to_montgomery(self.int_a)
return [self.format_result(result)]
class BignumModRawConvertFromMont(bignum_common.ModOperationCommon,
BignumModRawTarget):
""" Test cases for mpi_mod_raw_from_mont_rep(). """
test_function = "mpi_mod_raw_from_mont_rep"
test_name = "Convert from Mont: "
symbol = "1/R *"
input_style = "arch_split"
arity = 1
def result(self) -> List[str]:
result = self.from_montgomery(self.int_a)
return [self.format_result(result)]
class BignumModRawModNegate(bignum_common.ModOperationCommon,
BignumModRawTarget):
""" Test cases for mpi_mod_raw_neg(). """
test_function = "mpi_mod_raw_neg"
test_name = "Modular negation: "
symbol = "-"
input_style = "arch_split"
arity = 1
def result(self) -> List[str]:
result = (self.int_n - self.int_a) % self.int_n
return [self.format_result(result)]
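The negation formula above computes -a mod n; the trailing `% n` matters only for a == 0, where the result must be 0 rather than n. As a one-line sketch:

```python
def mod_neg(a: int, n: int) -> int:
    """Modular negation: -a mod n, with mod_neg(0, n) == 0."""
    return (n - a) % n
```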

@@ -0,0 +1,120 @@
"""Mbed TLS build tree information and manipulation.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import os
import inspect
from typing import Optional
def looks_like_tf_psa_crypto_root(path: str) -> bool:
"""Whether the given directory looks like the root of the PSA Crypto source tree."""
return all(os.path.isdir(os.path.join(path, subdir))
for subdir in ['include', 'core', 'drivers', 'programs', 'tests'])
def looks_like_mbedtls_root(path: str) -> bool:
"""Whether the given directory looks like the root of the Mbed TLS source tree."""
return all(os.path.isdir(os.path.join(path, subdir))
for subdir in ['include', 'library', 'programs', 'tests'])
def looks_like_root(path: str) -> bool:
return looks_like_tf_psa_crypto_root(path) or looks_like_mbedtls_root(path)
def crypto_core_directory(root: Optional[str] = None, relative: Optional[bool] = False) -> str:
"""
Return the path of the directory containing the PSA crypto core
for either TF-PSA-Crypto or Mbed TLS.
Returns either the full path or relative path depending on the
"relative" boolean argument.
"""
if root is None:
root = guess_project_root()
if looks_like_tf_psa_crypto_root(root):
if relative:
return "core"
return os.path.join(root, "core")
elif looks_like_mbedtls_root(root):
if relative:
return "library"
return os.path.join(root, "library")
else:
raise Exception('Neither Mbed TLS nor TF-PSA-Crypto source tree found')
def crypto_library_filename(root: Optional[str] = None) -> str:
"""Return the crypto library filename for either TF-PSA-Crypto or Mbed TLS."""
if root is None:
root = guess_project_root()
if looks_like_tf_psa_crypto_root(root):
return "tfpsacrypto"
elif looks_like_mbedtls_root(root):
return "mbedcrypto"
else:
raise Exception('Neither Mbed TLS nor TF-PSA-Crypto source tree found')
def check_repo_path():
"""Check that the current working directory is the project root, and throw
an exception if not.
"""
if not all(os.path.isdir(d) for d in ["include", "library", "tests"]):
raise Exception("This script must be run from Mbed TLS root")
def chdir_to_root() -> None:
"""Detect the root of the Mbed TLS source tree and change to it.
The current directory must be up to two levels deep inside an Mbed TLS
source tree.
"""
for d in [os.path.curdir,
os.path.pardir,
os.path.join(os.path.pardir, os.path.pardir)]:
if looks_like_root(d):
os.chdir(d)
return
raise Exception('Mbed TLS source tree not found')
def guess_project_root():
"""Guess project source code directory.
Return the first possible project root directory.
"""
dirs = set({})
for frame in inspect.stack():
path = os.path.dirname(frame.filename)
for d in ['.', os.path.pardir] \
+ [os.path.join(*([os.path.pardir]*i)) for i in range(2, 10)]:
d = os.path.abspath(os.path.join(path, d))
if d in dirs:
continue
dirs.add(d)
if looks_like_root(d):
return d
raise Exception('Neither Mbed TLS nor TF-PSA-Crypto source tree found')
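The root-detection helpers classify a directory purely by the subdirectories it contains. The sketch below exercises that logic on a synthetic tree built in a temporary directory (subdirectory names taken from `looks_like_mbedtls_root()` above; the temporary path itself is arbitrary):

```python
import os
import tempfile

def looks_like_mbedtls_root(path: str) -> bool:
    """Same check as the function above: all four subdirectories must exist."""
    return all(os.path.isdir(os.path.join(path, subdir))
               for subdir in ['include', 'library', 'programs', 'tests'])

with tempfile.TemporaryDirectory() as root:
    for subdir in ['include', 'library', 'programs', 'tests']:
        os.mkdir(os.path.join(root, subdir))
    assert looks_like_mbedtls_root(root)
    # A subdirectory of the root does not itself look like a root.
    assert not looks_like_mbedtls_root(os.path.join(root, 'library'))
```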
def guess_mbedtls_root(root: Optional[str] = None) -> str:
"""Guess Mbed TLS source code directory.
Return the first possible Mbed TLS root directory.
Raise an exception if we are not in Mbed TLS.
"""
if root is None:
root = guess_project_root()
if looks_like_mbedtls_root(root):
return root
else:
raise Exception('Mbed TLS source tree not found')
def guess_tf_psa_crypto_root(root: Optional[str] = None) -> str:
"""Guess TF-PSA-Crypto source code directory.
Return the first possible TF-PSA-Crypto root directory.
Raise an exception if we are not in TF-PSA-Crypto.
"""
if root is None:
root = guess_project_root()
if looks_like_tf_psa_crypto_root(root):
return root
else:
raise Exception('TF-PSA-Crypto source tree not found')

@@ -0,0 +1,162 @@
"""Generate and run C code.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import os
import platform
import subprocess
import sys
import tempfile
def remove_file_if_exists(filename):
"""Remove the specified file, ignoring errors."""
if not filename:
return
try:
os.remove(filename)
except OSError:
pass
def create_c_file(file_label):
"""Create a temporary C file.
* ``file_label``: a string that will be included in the file name.
Return ```(c_file, c_name, exe_name)``` where ``c_file`` is a Python
stream open for writing to the file, ``c_name`` is the name of the file
and ``exe_name`` is the name of the executable that will be produced
by compiling the file.
"""
c_fd, c_name = tempfile.mkstemp(prefix='tmp-{}-'.format(file_label),
suffix='.c')
exe_suffix = '.exe' if platform.system() == 'Windows' else ''
exe_name = c_name[:-2] + exe_suffix
remove_file_if_exists(exe_name)
c_file = os.fdopen(c_fd, 'w', encoding='ascii')
return c_file, c_name, exe_name
def generate_c_printf_expressions(c_file, cast_to, printf_format, expressions):
"""Generate C instructions to print the value of ``expressions``.
Write the code with ``c_file``'s ``write`` method.
Each expression is cast to the type ``cast_to`` and printed with the
printf format ``printf_format``.
"""
for expr in expressions:
c_file.write(' printf("{}\\n", ({}) {});\n'
.format(printf_format, cast_to, expr))
def generate_c_file(c_file,
caller, header,
main_generator):
"""Generate a temporary C source file.
* ``c_file`` is an open stream on the C source file.
* ``caller``: an informational string written in a comment at the top
of the file.
* ``header``: extra code to insert before any function in the generated
C file.
* ``main_generator``: a function called with ``c_file`` as its sole argument
to generate the body of the ``main()`` function.
"""
c_file.write('/* Generated by {} */'
.format(caller))
c_file.write('''
#include <stdio.h>
''')
c_file.write(header)
c_file.write('''
int main(void)
{
''')
main_generator(c_file)
c_file.write(''' return 0;
}
''')
def compile_c_file(c_filename, exe_filename, include_dirs):
"""Compile a C source file with the host compiler.
* ``c_filename``: the name of the source file to compile.
* ``exe_filename``: the name for the executable to be created.
* ``include_dirs``: a list of paths to include directories to be passed
with the -I switch.
"""
# Respect $HOSTCC if it is set
cc = os.getenv('HOSTCC', None)
if cc is None:
cc = os.getenv('CC', 'cc')
cmd = [cc]
proc = subprocess.Popen(cmd,
stdout=subprocess.DEVNULL,
stderr=subprocess.PIPE,
universal_newlines=True)
cc_is_msvc = 'Microsoft (R) C/C++' in proc.communicate()[1]
cmd += ['-I' + dir for dir in include_dirs]
if cc_is_msvc:
# MSVC has deprecated using -o to specify the output file,
# and produces an object file in the working directory by default.
obj_filename = exe_filename[:-4] + '.obj'
cmd += ['-Fe' + exe_filename, '-Fo' + obj_filename]
else:
cmd += ['-o' + exe_filename]
subprocess.check_call(cmd + [c_filename])
def get_c_expression_values(
cast_to, printf_format,
expressions,
caller=__name__, file_label='',
header='', include_path=None,
keep_c=False,
): # pylint: disable=too-many-arguments, too-many-locals
"""Generate and run a program to print out numerical values for expressions.
* ``cast_to``: a C type.
* ``printf_format``: a printf format suitable for the type ``cast_to``.
* ``header``: extra code to insert before any function in the generated
C file.
* ``expressions``: a list of C language expressions that have the type
``cast_to``.
* ``include_path``: a list of directories containing header files.
* ``keep_c``: if true, keep the temporary C file (presumably for debugging
purposes).
Use the C compiler specified by the ``HOSTCC`` environment variable, falling
back to ``CC`` and defaulting to ``cc``. If the compiler looks like MSVC,
use its command line syntax, otherwise assume the compiler supports the
traditional Unix ``-I`` and ``-o`` options.
Return the list of values of the ``expressions``.
"""
if include_path is None:
include_path = []
c_name = None
exe_name = None
obj_name = None
try:
c_file, c_name, exe_name = create_c_file(file_label)
generate_c_file(
c_file, caller, header,
lambda c_file: generate_c_printf_expressions(c_file,
cast_to, printf_format,
expressions)
)
c_file.close()
compile_c_file(c_name, exe_name, include_path)
if keep_c:
sys.stderr.write('List of {} tests kept at {}\n'
.format(caller, c_name))
else:
os.remove(c_name)
output = subprocess.check_output([exe_name])
return output.decode('ascii').strip().split('\n')
finally:
remove_file_if_exists(exe_name)
remove_file_if_exists(obj_name)



@@ -0,0 +1,112 @@
"""Generate test data for cryptographic mechanisms.
This module is a work in progress, only implementing a few cases for now.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import hashlib
from typing import Callable, Dict, Iterator, List, Optional #pylint: disable=unused-import
from . import crypto_knowledge
from . import psa_information
from . import test_case
def psa_low_level_dependencies(*expressions: str) -> List[str]:
"""Infer dependencies of a PSA low-level test case by looking for PSA_xxx symbols.
This function generates MBEDTLS_PSA_BUILTIN_xxx symbols.
"""
high_level = psa_information.automatic_dependencies(*expressions)
for dep in high_level:
assert dep.startswith('PSA_WANT_')
return ['MBEDTLS_PSA_BUILTIN_' + dep[9:] for dep in high_level]
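The prefix rewrite at the heart of `psa_low_level_dependencies` can be shown on its own. This is a self-contained sketch of just the renaming step (the real function first derives the `PSA_WANT_xxx` symbols via `psa_information.automatic_dependencies`):

```python
def builtin_symbol(want_symbol: str) -> str:
    # 'PSA_WANT_' is 9 characters long, hence the [9:] slice in the
    # list comprehension above.
    assert want_symbol.startswith('PSA_WANT_')
    return 'MBEDTLS_PSA_BUILTIN_' + want_symbol[9:]

assert builtin_symbol('PSA_WANT_ALG_SHA_256') == 'MBEDTLS_PSA_BUILTIN_ALG_SHA_256'
```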
class HashPSALowLevel:
"""Generate test cases for the PSA low-level hash interface."""
def __init__(self, info: psa_information.Information) -> None:
self.info = info
base_algorithms = sorted(info.constructors.algorithms)
all_algorithms = \
[crypto_knowledge.Algorithm(expr)
for expr in info.constructors.generate_expressions(base_algorithms)]
self.algorithms = \
[alg
for alg in all_algorithms
if (not alg.is_wildcard and
alg.can_do(crypto_knowledge.AlgorithmCategory.HASH))]
# CALCULATE[alg] = function to return the hash of its argument in hex
# TO-DO: implement the None entries with a third-party library, because
# hashlib might not have everything, depending on the Python version and
# the underlying OpenSSL. On Ubuntu 16.04, truncated sha512 and sha3/shake
# are not available. On Ubuntu 22.04, md2, md4 and ripemd160 are not
# available.
CALCULATE = {
'PSA_ALG_MD5': lambda data: hashlib.md5(data).hexdigest(),
        'PSA_ALG_RIPEMD160': None, #lambda data: hashlib.new('ripemd160', data).hexdigest()
'PSA_ALG_SHA_1': lambda data: hashlib.sha1(data).hexdigest(),
'PSA_ALG_SHA_224': lambda data: hashlib.sha224(data).hexdigest(),
'PSA_ALG_SHA_256': lambda data: hashlib.sha256(data).hexdigest(),
'PSA_ALG_SHA_384': lambda data: hashlib.sha384(data).hexdigest(),
'PSA_ALG_SHA_512': lambda data: hashlib.sha512(data).hexdigest(),
        'PSA_ALG_SHA_512_224': None, #lambda data: hashlib.new('sha512_224', data).hexdigest()
        'PSA_ALG_SHA_512_256': None, #lambda data: hashlib.new('sha512_256', data).hexdigest()
'PSA_ALG_SHA3_224': None, #lambda data: hashlib.sha3_224(data).hexdigest(),
'PSA_ALG_SHA3_256': None, #lambda data: hashlib.sha3_256(data).hexdigest(),
'PSA_ALG_SHA3_384': None, #lambda data: hashlib.sha3_384(data).hexdigest(),
'PSA_ALG_SHA3_512': None, #lambda data: hashlib.sha3_512(data).hexdigest(),
'PSA_ALG_SHAKE256_512': None, #lambda data: hashlib.shake_256(data).hexdigest(64),
} #type: Dict[str, Optional[Callable[[bytes], str]]]
@staticmethod
def one_test_case(alg: crypto_knowledge.Algorithm,
function: str, note: str,
arguments: List[str]) -> test_case.TestCase:
"""Construct one test case involving a hash."""
tc = test_case.TestCase()
tc.set_description('{}{} {}'
.format(function,
' ' + note if note else '',
alg.short_expression()))
tc.set_dependencies(psa_low_level_dependencies(alg.expression))
tc.set_function(function)
tc.set_arguments([alg.expression] +
['"{}"'.format(arg) for arg in arguments])
return tc
def test_cases_for_hash(self,
alg: crypto_knowledge.Algorithm
) -> Iterator[test_case.TestCase]:
"""Enumerate all test cases for one hash algorithm."""
calc = self.CALCULATE[alg.expression]
if calc is None:
return # not implemented yet
short = b'abc'
hash_short = calc(short)
long = (b'Hello, world. Here are 16 unprintable bytes: ['
b'\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a'
b'\x80\x81\x82\x83\xfe\xff]. '
b' This message was brought to you by a natural intelligence. '
b' If you can read this, good luck with your debugging!')
hash_long = calc(long)
yield self.one_test_case(alg, 'hash_empty', '', [calc(b'')])
yield self.one_test_case(alg, 'hash_valid_one_shot', '',
[short.hex(), hash_short])
for n in [0, 1, 64, len(long) - 1, len(long)]:
yield self.one_test_case(alg, 'hash_valid_multipart',
'{} + {}'.format(n, len(long) - n),
[long[:n].hex(), calc(long[:n]),
long[n:].hex(), hash_long])
def all_test_cases(self) -> Iterator[test_case.TestCase]:
"""Enumerate all test cases for all hash algorithms."""
for alg in self.algorithms:
yield from self.test_cases_for_hash(alg)
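The `hash_valid_multipart` cases generated by `test_cases_for_hash` rely on the incremental-hashing property: feeding the message in two chunks must give the same digest as hashing it in one shot. A quick check of that property with `hashlib` (using SHA-256 as a stand-in for any supported algorithm):

```python
import hashlib

data = b'Hello, world. This is a multipart hashing check.'
one_shot = hashlib.sha256(data).hexdigest()

# Split at the same kinds of boundaries the generator uses
# (0, 1, a block-ish offset, len-1, len).
for split in (0, 1, 16, len(data) - 1, len(data)):
    h = hashlib.sha256()
    h.update(data[:split])
    h.update(data[split:])
    assert h.hexdigest() == one_shot
```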



@@ -0,0 +1,568 @@
"""Knowledge about cryptographic mechanisms implemented in Mbed TLS.
This module is entirely based on the PSA API.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import enum
import re
from typing import FrozenSet, Iterable, List, Optional, Tuple, Dict
from .asymmetric_key_data import ASYMMETRIC_KEY_DATA
def short_expression(original: str, level: int = 0) -> str:
"""Abbreviate the expression, keeping it human-readable.
If `level` is 0, just remove parts that are implicit from context,
such as a leading ``PSA_KEY_TYPE_``.
For larger values of `level`, also abbreviate some names in an
unambiguous, but ad hoc way.
"""
short = original
short = re.sub(r'\bPSA_(?:ALG|DH_FAMILY|ECC_FAMILY|KEY_[A-Z]+)_', r'', short)
short = re.sub(r' +', r'', short)
if level >= 1:
short = re.sub(r'PUBLIC_KEY\b', r'PUB', short)
short = re.sub(r'KEY_PAIR\b', r'PAIR', short)
short = re.sub(r'\bBRAINPOOL_P', r'BP', short)
short = re.sub(r'\bMONTGOMERY\b', r'MGM', short)
short = re.sub(r'AEAD_WITH_SHORTENED_TAG\b', r'AEAD_SHORT', short)
short = re.sub(r'\bDETERMINISTIC_', r'DET_', short)
short = re.sub(r'\bKEY_AGREEMENT\b', r'KA', short)
short = re.sub(r'_PSK_TO_MS\b', r'_PSK2MS', short)
return short
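To illustrate the abbreviation, here is a condensed standalone copy of the substitutions above (only the level-0 stripping and two of the level-1 rules are reproduced, so that the snippet runs on its own):

```python
import re

def abbreviate(original: str, level: int = 0) -> str:
    # Level 0: drop prefixes that are implicit from context.
    short = original
    short = re.sub(r'\bPSA_(?:ALG|DH_FAMILY|ECC_FAMILY|KEY_[A-Z]+)_', r'', short)
    short = re.sub(r' +', r'', short)
    if level >= 1:
        # Level 1+: ad hoc but unambiguous shortenings.
        short = re.sub(r'PUBLIC_KEY\b', r'PUB', short)
        short = re.sub(r'KEY_PAIR\b', r'PAIR', short)
    return short

assert abbreviate('PSA_ALG_HMAC(PSA_ALG_SHA_256)') == 'HMAC(SHA_256)'
assert abbreviate('PSA_KEY_TYPE_ECC_KEY_PAIR(PSA_ECC_FAMILY_SECP_R1)',
                  level=1) == 'ECC_PAIR(SECP_R1)'
```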
BLOCK_CIPHERS = frozenset(['AES', 'ARIA', 'CAMELLIA', 'DES'])
BLOCK_MAC_MODES = frozenset(['CBC_MAC', 'CMAC'])
BLOCK_CIPHER_MODES = frozenset([
'CTR', 'CFB', 'OFB', 'XTS', 'CCM_STAR_NO_TAG',
'ECB_NO_PADDING', 'CBC_NO_PADDING', 'CBC_PKCS7',
])
BLOCK_AEAD_MODES = frozenset(['CCM', 'GCM'])
class EllipticCurveCategory(enum.Enum):
"""Categorization of elliptic curve families.
The category of a curve determines what algorithms are defined over it.
"""
SHORT_WEIERSTRASS = 0
MONTGOMERY = 1
TWISTED_EDWARDS = 2
@staticmethod
def from_family(family: str) -> 'EllipticCurveCategory':
if family == 'PSA_ECC_FAMILY_MONTGOMERY':
return EllipticCurveCategory.MONTGOMERY
if family == 'PSA_ECC_FAMILY_TWISTED_EDWARDS':
return EllipticCurveCategory.TWISTED_EDWARDS
# Default to SW, which most curves belong to.
return EllipticCurveCategory.SHORT_WEIERSTRASS
class KeyType:
"""Knowledge about a PSA key type."""
def __init__(self, name: str, params: Optional[Iterable[str]] = None) -> None:
"""Analyze a key type.
The key type must be specified in PSA syntax. In its simplest form,
`name` is a string 'PSA_KEY_TYPE_xxx' which is the name of a PSA key
type macro. For key types that take arguments, the arguments can
be passed either through the optional argument `params` or by
passing an expression of the form 'PSA_KEY_TYPE_xxx(param1, ...)'
in `name` as a string.
"""
self.name = name.strip()
"""The key type macro name (``PSA_KEY_TYPE_xxx``).
For key types constructed from a macro with arguments, this is the
name of the macro, and the arguments are in `self.params`.
"""
if params is None:
if '(' in self.name:
m = re.match(r'(\w+)\s*\((.*)\)\Z', self.name)
assert m is not None
self.name = m.group(1)
params = m.group(2).split(',')
self.params = (None if params is None else
[param.strip() for param in params])
"""The parameters of the key type, if there are any.
None if the key type is a macro without arguments.
"""
assert re.match(r'PSA_KEY_TYPE_\w+\Z', self.name)
self.expression = self.name
"""A C expression whose value is the key type encoding."""
if self.params is not None:
self.expression += '(' + ', '.join(self.params) + ')'
m = re.match(r'PSA_KEY_TYPE_(\w+)', self.name)
assert m
self.head = re.sub(r'_(?:PUBLIC_KEY|KEY_PAIR)\Z', r'', m.group(1))
"""The key type macro name, with common prefixes and suffixes stripped."""
self.private_type = re.sub(r'_PUBLIC_KEY\Z', r'_KEY_PAIR', self.name)
"""The key type macro name for the corresponding key pair type.
For everything other than a public key type, this is the same as
`self.name`.
"""
def short_expression(self, level: int = 0) -> str:
"""Abbreviate the expression, keeping it human-readable.
See `crypto_knowledge.short_expression`.
"""
return short_expression(self.expression, level=level)
def is_public(self) -> bool:
"""Whether the key type is for public keys."""
return self.name.endswith('_PUBLIC_KEY')
DH_KEY_SIZES = {
'PSA_DH_FAMILY_RFC7919': (2048, 3072, 4096, 6144, 8192),
} # type: Dict[str, Tuple[int, ...]]
ECC_KEY_SIZES = {
'PSA_ECC_FAMILY_SECP_K1': (192, 225, 256),
'PSA_ECC_FAMILY_SECP_R1': (224, 256, 384, 521),
'PSA_ECC_FAMILY_SECP_R2': (160,),
'PSA_ECC_FAMILY_SECT_K1': (163, 233, 239, 283, 409, 571),
'PSA_ECC_FAMILY_SECT_R1': (163, 233, 283, 409, 571),
'PSA_ECC_FAMILY_SECT_R2': (163,),
'PSA_ECC_FAMILY_BRAINPOOL_P_R1': (160, 192, 224, 256, 320, 384, 512),
'PSA_ECC_FAMILY_MONTGOMERY': (255, 448),
'PSA_ECC_FAMILY_TWISTED_EDWARDS': (255, 448),
} # type: Dict[str, Tuple[int, ...]]
KEY_TYPE_SIZES = {
'PSA_KEY_TYPE_AES': (128, 192, 256), # exhaustive
'PSA_KEY_TYPE_ARIA': (128, 192, 256), # exhaustive
'PSA_KEY_TYPE_CAMELLIA': (128, 192, 256), # exhaustive
'PSA_KEY_TYPE_CHACHA20': (256,), # exhaustive
'PSA_KEY_TYPE_DERIVE': (120, 128), # sample
'PSA_KEY_TYPE_DES': (64, 128, 192), # exhaustive
'PSA_KEY_TYPE_HMAC': (128, 160, 224, 256, 384, 512), # standard size for each supported hash
'PSA_KEY_TYPE_PASSWORD': (48, 168, 336), # sample
'PSA_KEY_TYPE_PASSWORD_HASH': (128, 256), # sample
'PSA_KEY_TYPE_PEPPER': (128, 256), # sample
'PSA_KEY_TYPE_RAW_DATA': (8, 40, 128), # sample
'PSA_KEY_TYPE_RSA_KEY_PAIR': (1024, 1536), # small sample
} # type: Dict[str, Tuple[int, ...]]
def sizes_to_test(self) -> Tuple[int, ...]:
"""Return a tuple of key sizes to test.
For key types that only allow a single size, or only a small set of
sizes, these are all the possible sizes. For key types that allow a
wide range of sizes, these are a representative sample of sizes,
excluding large sizes for which a typical resource-constrained platform
may run out of memory.
"""
if self.private_type == 'PSA_KEY_TYPE_ECC_KEY_PAIR':
assert self.params is not None
return self.ECC_KEY_SIZES[self.params[0]]
if self.private_type == 'PSA_KEY_TYPE_DH_KEY_PAIR':
assert self.params is not None
return self.DH_KEY_SIZES[self.params[0]]
return self.KEY_TYPE_SIZES[self.private_type]
# "48657265006973206b6579a064617461"
DATA_BLOCK = b'Here\000is key\240data'
def key_material(self, bits: int) -> bytes:
"""Return a byte string containing suitable key material with the given bit length.
Use the PSA export representation. The resulting byte string is one that
can be obtained with the following code:
```
psa_set_key_type(&attributes, `self.expression`);
psa_set_key_bits(&attributes, `bits`);
psa_set_key_usage_flags(&attributes, PSA_KEY_USAGE_EXPORT);
psa_generate_key(&attributes, &id);
psa_export_key(id, `material`, ...);
```
"""
if self.expression in ASYMMETRIC_KEY_DATA:
if bits not in ASYMMETRIC_KEY_DATA[self.expression]:
raise ValueError('No key data for {}-bit {}'
.format(bits, self.expression))
return ASYMMETRIC_KEY_DATA[self.expression][bits]
if bits % 8 != 0:
raise ValueError('Non-integer number of bytes: {} bits for {}'
.format(bits, self.expression))
length = bits // 8
if self.name == 'PSA_KEY_TYPE_DES':
# "644573206b457901644573206b457902644573206b457904"
des3 = b'dEs kEy\001dEs kEy\002dEs kEy\004'
return des3[:length]
return b''.join([self.DATA_BLOCK] * (length // len(self.DATA_BLOCK)) +
[self.DATA_BLOCK[:length % len(self.DATA_BLOCK)]])
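For symmetric key types without fixed test data, `key_material` fills the requested length by repeating a 16-byte block and truncating the remainder. A standalone sketch of that filler rule:

```python
DATA_BLOCK = b'Here\000is key\240data'  # the same 16-byte block as above

def filler_key(bits: int) -> bytes:
    # Repeat the block length // 16 times, then append a partial block
    # to reach exactly length bytes.
    assert bits % 8 == 0
    length = bits // 8
    return b''.join([DATA_BLOCK] * (length // len(DATA_BLOCK)) +
                    [DATA_BLOCK[:length % len(DATA_BLOCK)]])

assert len(DATA_BLOCK) == 16
assert filler_key(128) == DATA_BLOCK
assert filler_key(160) == DATA_BLOCK + b'Here'
```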
def can_do(self, alg: 'Algorithm') -> bool:
"""Whether this key type can be used for operations with the given algorithm.
This function does not currently handle key derivation or PAKE.
"""
#pylint: disable=too-many-branches,too-many-return-statements
if not alg.is_valid_for_operation():
return False
if self.head == 'HMAC' and alg.head == 'HMAC':
return True
if self.head == 'DES':
# 64-bit block ciphers only allow a reduced set of modes.
return alg.head in [
'CBC_NO_PADDING', 'CBC_PKCS7',
'ECB_NO_PADDING',
]
if self.head in BLOCK_CIPHERS and \
alg.head in frozenset.union(BLOCK_MAC_MODES,
BLOCK_CIPHER_MODES,
BLOCK_AEAD_MODES):
if alg.head in ['CMAC', 'OFB'] and \
self.head in ['ARIA', 'CAMELLIA']:
return False # not implemented in Mbed TLS
return True
if self.head == 'CHACHA20' and alg.head == 'CHACHA20_POLY1305':
return True
if self.head in {'ARC4', 'CHACHA20'} and \
alg.head == 'STREAM_CIPHER':
return True
if self.head == 'RSA' and alg.head.startswith('RSA_'):
return True
if alg.category == AlgorithmCategory.KEY_AGREEMENT and \
self.is_public():
# The PSA API does not use public key objects in key agreement
# operations: it imports the public key as a formatted byte string.
# So a public key object with a key agreement algorithm is not
# a valid combination.
return False
if alg.is_invalid_key_agreement_with_derivation():
return False
if self.head == 'ECC':
assert self.params is not None
eccc = EllipticCurveCategory.from_family(self.params[0])
if alg.head == 'ECDH' and \
eccc in {EllipticCurveCategory.SHORT_WEIERSTRASS,
EllipticCurveCategory.MONTGOMERY}:
return True
if alg.head == 'ECDSA' and \
eccc == EllipticCurveCategory.SHORT_WEIERSTRASS:
return True
if alg.head in {'PURE_EDDSA', 'EDDSA_PREHASH'} and \
eccc == EllipticCurveCategory.TWISTED_EDWARDS:
return True
if self.head == 'DH' and alg.head == 'FFDH':
return True
return False
class AlgorithmCategory(enum.Enum):
"""PSA algorithm categories."""
# The numbers are aligned with the category bits in numerical values of
# algorithms.
HASH = 2
MAC = 3
CIPHER = 4
AEAD = 5
SIGN = 6
ASYMMETRIC_ENCRYPTION = 7
KEY_DERIVATION = 8
KEY_AGREEMENT = 9
PAKE = 10
def requires_key(self) -> bool:
"""Whether operations in this category are set up with a key."""
return self not in {self.HASH, self.KEY_DERIVATION}
def is_asymmetric(self) -> bool:
"""Whether operations in this category involve asymmetric keys."""
return self in {
self.SIGN,
self.ASYMMETRIC_ENCRYPTION,
self.KEY_AGREEMENT
}
class AlgorithmNotRecognized(Exception):
def __init__(self, expr: str) -> None:
super().__init__('Algorithm not recognized: ' + expr)
self.expr = expr
class Algorithm:
"""Knowledge about a PSA algorithm."""
@staticmethod
def determine_base(expr: str) -> str:
"""Return an expression for the "base" of the algorithm.
This strips off variants of algorithms such as MAC truncation.
This function does not attempt to detect invalid inputs.
"""
m = re.match(r'PSA_ALG_(?:'
r'(?:TRUNCATED|AT_LEAST_THIS_LENGTH)_MAC|'
r'AEAD_WITH_(?:SHORTENED|AT_LEAST_THIS_LENGTH)_TAG'
r')\((.*),[^,]+\)\Z', expr)
if m:
expr = m.group(1)
return expr
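The base-stripping regex can be exercised directly. A self-contained copy of `determine_base` with its pattern, showing that one layer of MAC truncation or AEAD tag shortening is peeled off while other expressions pass through unchanged:

```python
import re

TRUNCATION_RE = re.compile(r'PSA_ALG_(?:'
                           r'(?:TRUNCATED|AT_LEAST_THIS_LENGTH)_MAC|'
                           r'AEAD_WITH_(?:SHORTENED|AT_LEAST_THIS_LENGTH)_TAG'
                           r')\((.*),[^,]+\)\Z')

def base_of(expr: str) -> str:
    # Same logic as Algorithm.determine_base, reproduced standalone.
    m = TRUNCATION_RE.match(expr)
    return m.group(1) if m else expr

assert base_of('PSA_ALG_TRUNCATED_MAC(PSA_ALG_HMAC(PSA_ALG_SHA_256),10)') \
       == 'PSA_ALG_HMAC(PSA_ALG_SHA_256)'
assert base_of('PSA_ALG_GCM') == 'PSA_ALG_GCM'
```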
@staticmethod
def determine_head(expr: str) -> str:
"""Return the head of an algorithm expression.
The head is the first (outermost) constructor, without its PSA_ALG_
prefix, and with some normalization of similar algorithms.
"""
m = re.match(r'PSA_ALG_(?:DETERMINISTIC_)?(\w+)', expr)
if not m:
raise AlgorithmNotRecognized(expr)
head = m.group(1)
if head == 'KEY_AGREEMENT':
m = re.match(r'PSA_ALG_KEY_AGREEMENT\s*\(\s*PSA_ALG_(\w+)', expr)
if not m:
raise AlgorithmNotRecognized(expr)
head = m.group(1)
head = re.sub(r'_ANY\Z', r'', head)
if re.match(r'ED[0-9]+PH\Z', head):
head = 'EDDSA_PREHASH'
return head
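The head extraction, including the normalization of deterministic and pre-hash variants, is compact enough to demonstrate standalone. This is a condensed copy of `determine_head` for illustration:

```python
import re

def head_of(expr: str) -> str:
    m = re.match(r'PSA_ALG_(?:DETERMINISTIC_)?(\w+)', expr)
    assert m, expr
    head = m.group(1)
    if head == 'KEY_AGREEMENT':
        # Look inside the combined constructor for the raw agreement.
        m = re.match(r'PSA_ALG_KEY_AGREEMENT\s*\(\s*PSA_ALG_(\w+)', expr)
        assert m, expr
        head = m.group(1)
    head = re.sub(r'_ANY\Z', r'', head)
    if re.match(r'ED[0-9]+PH\Z', head):
        head = 'EDDSA_PREHASH'
    return head

assert head_of('PSA_ALG_DETERMINISTIC_ECDSA(PSA_ALG_SHA_256)') == 'ECDSA'
assert head_of('PSA_ALG_ED25519PH') == 'EDDSA_PREHASH'
```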
CATEGORY_FROM_HEAD = {
'SHA': AlgorithmCategory.HASH,
'SHAKE256_512': AlgorithmCategory.HASH,
'MD': AlgorithmCategory.HASH,
'RIPEMD': AlgorithmCategory.HASH,
'ANY_HASH': AlgorithmCategory.HASH,
'HMAC': AlgorithmCategory.MAC,
'STREAM_CIPHER': AlgorithmCategory.CIPHER,
'CHACHA20_POLY1305': AlgorithmCategory.AEAD,
'DSA': AlgorithmCategory.SIGN,
'ECDSA': AlgorithmCategory.SIGN,
'EDDSA': AlgorithmCategory.SIGN,
'PURE_EDDSA': AlgorithmCategory.SIGN,
'RSA_PSS': AlgorithmCategory.SIGN,
'RSA_PKCS1V15_SIGN': AlgorithmCategory.SIGN,
'RSA_PKCS1V15_CRYPT': AlgorithmCategory.ASYMMETRIC_ENCRYPTION,
'RSA_OAEP': AlgorithmCategory.ASYMMETRIC_ENCRYPTION,
'HKDF': AlgorithmCategory.KEY_DERIVATION,
'TLS12_PRF': AlgorithmCategory.KEY_DERIVATION,
'TLS12_PSK_TO_MS': AlgorithmCategory.KEY_DERIVATION,
'TLS12_ECJPAKE_TO_PMS': AlgorithmCategory.KEY_DERIVATION,
'PBKDF': AlgorithmCategory.KEY_DERIVATION,
'ECDH': AlgorithmCategory.KEY_AGREEMENT,
'FFDH': AlgorithmCategory.KEY_AGREEMENT,
# KEY_AGREEMENT(...) is a key derivation with a key agreement component
'KEY_AGREEMENT': AlgorithmCategory.KEY_DERIVATION,
'JPAKE': AlgorithmCategory.PAKE,
}
for x in BLOCK_MAC_MODES:
CATEGORY_FROM_HEAD[x] = AlgorithmCategory.MAC
for x in BLOCK_CIPHER_MODES:
CATEGORY_FROM_HEAD[x] = AlgorithmCategory.CIPHER
for x in BLOCK_AEAD_MODES:
CATEGORY_FROM_HEAD[x] = AlgorithmCategory.AEAD
def determine_category(self, expr: str, head: str) -> AlgorithmCategory:
"""Return the category of the given algorithm expression.
This function does not attempt to detect invalid inputs.
"""
prefix = head
while prefix:
if prefix in self.CATEGORY_FROM_HEAD:
return self.CATEGORY_FROM_HEAD[prefix]
if re.match(r'.*[0-9]\Z', prefix):
prefix = re.sub(r'_*[0-9]+\Z', r'', prefix)
else:
prefix = re.sub(r'_*[^_]*\Z', r'', prefix)
raise AlgorithmNotRecognized(expr)
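The category lookup walks ever-shorter prefixes of the head: trailing digits are dropped first, then whole trailing words. A self-contained sketch with a trimmed table (plain strings stand in for `AlgorithmCategory` members here, purely for illustration):

```python
import re

CATEGORY_FROM_HEAD = {'SHA': 'HASH', 'RIPEMD': 'HASH', 'HMAC': 'MAC'}

def category_of(head: str) -> str:
    # Mirrors the prefix walk in Algorithm.determine_category.
    prefix = head
    while prefix:
        if prefix in CATEGORY_FROM_HEAD:
            return CATEGORY_FROM_HEAD[prefix]
        if re.match(r'.*[0-9]\Z', prefix):
            prefix = re.sub(r'_*[0-9]+\Z', r'', prefix)   # drop trailing digits
        else:
            prefix = re.sub(r'_*[^_]*\Z', r'', prefix)    # drop a trailing word
    raise ValueError('Algorithm not recognized: ' + head)

assert category_of('SHA_256') == 'HASH'     # 'SHA_256' -> 'SHA'
assert category_of('RIPEMD160') == 'HASH'   # 'RIPEMD160' -> 'RIPEMD'
```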
@staticmethod
def determine_wildcard(expr) -> bool:
"""Whether the given algorithm expression is a wildcard.
This function does not attempt to detect invalid inputs.
"""
if re.search(r'\bPSA_ALG_ANY_HASH\b', expr):
return True
if re.search(r'_AT_LEAST_', expr):
return True
return False
def __init__(self, expr: str) -> None:
"""Analyze an algorithm value.
The algorithm must be expressed as a C expression containing only
calls to PSA algorithm constructor macros and numeric literals.
This class is only programmed to handle valid expressions. Invalid
expressions may result in exceptions or in nonsensical results.
"""
self.expression = re.sub(r'\s+', r'', expr)
self.base_expression = self.determine_base(self.expression)
self.head = self.determine_head(self.base_expression)
self.category = self.determine_category(self.base_expression, self.head)
self.is_wildcard = self.determine_wildcard(self.expression)
def get_key_agreement_derivation(self) -> Optional[str]:
"""For a combined key agreement and key derivation algorithm, get the derivation part.
For anything else, return None.
"""
if self.category != AlgorithmCategory.KEY_AGREEMENT:
return None
m = re.match(r'PSA_ALG_KEY_AGREEMENT\(\w+,\s*(.*)\)\Z', self.expression)
if not m:
return None
kdf_alg = m.group(1)
# Assume kdf_alg is either a valid KDF or 0.
if re.match(r'(?:0[Xx])?0+\s*\Z', kdf_alg):
return None
return kdf_alg
KEY_DERIVATIONS_INCOMPATIBLE_WITH_AGREEMENT = frozenset([
'PSA_ALG_TLS12_ECJPAKE_TO_PMS', # secret input in specific format
])
def is_valid_key_agreement_with_derivation(self) -> bool:
"""Whether this is a valid combined key agreement and key derivation algorithm."""
kdf_alg = self.get_key_agreement_derivation()
if kdf_alg is None:
return False
return kdf_alg not in self.KEY_DERIVATIONS_INCOMPATIBLE_WITH_AGREEMENT
def is_invalid_key_agreement_with_derivation(self) -> bool:
"""Whether this is an invalid combined key agreement and key derivation algorithm."""
kdf_alg = self.get_key_agreement_derivation()
if kdf_alg is None:
return False
return kdf_alg in self.KEY_DERIVATIONS_INCOMPATIBLE_WITH_AGREEMENT
def short_expression(self, level: int = 0) -> str:
"""Abbreviate the expression, keeping it human-readable.
See `crypto_knowledge.short_expression`.
"""
return short_expression(self.expression, level=level)
HASH_LENGTH = {
'PSA_ALG_MD5': 16,
'PSA_ALG_SHA_1': 20,
}
HASH_LENGTH_BITS_RE = re.compile(r'([0-9]+)\Z')
@classmethod
def hash_length(cls, alg: str) -> int:
"""The length of the given hash algorithm, in bytes."""
if alg in cls.HASH_LENGTH:
return cls.HASH_LENGTH[alg]
m = cls.HASH_LENGTH_BITS_RE.search(alg)
if m:
return int(m.group(1)) // 8
raise ValueError('Unknown hash length for ' + alg)
PERMITTED_TAG_LENGTHS = {
'PSA_ALG_CCM': frozenset([4, 6, 8, 10, 12, 14, 16]),
'PSA_ALG_CHACHA20_POLY1305': frozenset([16]),
'PSA_ALG_GCM': frozenset([4, 8, 12, 13, 14, 15, 16]),
}
MAC_LENGTH = {
'PSA_ALG_CBC_MAC': 16, # actually the block cipher length
'PSA_ALG_CMAC': 16, # actually the block cipher length
}
HMAC_RE = re.compile(r'PSA_ALG_HMAC\((.*)\)\Z')
@classmethod
def permitted_truncations(cls, base: str) -> FrozenSet[int]:
"""Permitted output lengths for the given MAC or AEAD base algorithm.
For a MAC algorithm, this is the set of truncation lengths that
Mbed TLS supports.
For an AEAD algorithm, this is the set of truncation lengths that
are permitted by the algorithm specification.
"""
if base in cls.PERMITTED_TAG_LENGTHS:
return cls.PERMITTED_TAG_LENGTHS[base]
max_length = cls.MAC_LENGTH.get(base, None)
if max_length is None:
m = cls.HMAC_RE.match(base)
if m:
max_length = cls.hash_length(m.group(1))
if max_length is None:
raise ValueError('Unknown permitted lengths for ' + base)
return frozenset(range(4, max_length + 1))
TRUNCATED_ALG_RE = re.compile(
r'(?P<face>PSA_ALG_(?:AEAD_WITH_SHORTENED_TAG|TRUNCATED_MAC))'
r'\((?P<base>.*),'
r'(?P<length>0[Xx][0-9A-Fa-f]+|[1-9][0-9]*|0[0-7]*)[LUlu]*\)\Z')
def is_invalid_truncation(self) -> bool:
"""False for a MAC or AEAD algorithm truncated to an invalid length.
True for a MAC or AEAD algorithm truncated to a valid length or to
a length that cannot be determined. True for anything other than
a truncated MAC or AEAD.
"""
m = self.TRUNCATED_ALG_RE.match(self.expression)
if m:
base = m.group('base')
to_length = int(m.group('length'), 0)
permitted_lengths = self.permitted_truncations(base)
if to_length not in permitted_lengths:
return True
return False
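Putting the pieces together for one concrete case: GCM permits tags of 4, 8, and 12 to 16 bytes, so shortening the tag below 4 bytes is an invalid truncation. A standalone sketch specialized to GCM (the real method dispatches through `permitted_truncations` for all bases):

```python
import re

GCM_TAG_LENGTHS = frozenset([4, 8, 12, 13, 14, 15, 16])

TRUNCATED_ALG_RE = re.compile(
    r'(?P<face>PSA_ALG_(?:AEAD_WITH_SHORTENED_TAG|TRUNCATED_MAC))'
    r'\((?P<base>.*),'
    r'(?P<length>0[Xx][0-9A-Fa-f]+|[1-9][0-9]*|0[0-7]*)[LUlu]*\)\Z')

def is_invalid_gcm_truncation(expr: str) -> bool:
    # True when the expression shortens a GCM tag to a forbidden length.
    m = TRUNCATED_ALG_RE.match(expr)
    if m and m.group('base') == 'PSA_ALG_GCM':
        return int(m.group('length'), 0) not in GCM_TAG_LENGTHS
    return False

assert not is_invalid_gcm_truncation('PSA_ALG_AEAD_WITH_SHORTENED_TAG(PSA_ALG_GCM,8)')
assert is_invalid_gcm_truncation('PSA_ALG_AEAD_WITH_SHORTENED_TAG(PSA_ALG_GCM,3)')
```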
def is_valid_for_operation(self) -> bool:
"""Whether this algorithm construction is valid for an operation.
This function assumes that the algorithm is constructed in a
"grammatically" correct way, and only rejects semantically invalid
combinations.
"""
if self.is_wildcard:
return False
if self.is_invalid_truncation():
return False
return True
def can_do(self, category: AlgorithmCategory) -> bool:
"""Whether this algorithm can perform operations in the given category.
"""
if category == self.category:
return True
if category == AlgorithmCategory.KEY_DERIVATION and \
self.is_valid_key_agreement_with_derivation():
return True
return False
def usage_flags(self, public: bool = False) -> List[str]:
"""The list of usage flags describing operations that can perform this algorithm.
If public is true, only return public-key operations, not private-key operations.
"""
if self.category == AlgorithmCategory.HASH:
flags = []
elif self.category == AlgorithmCategory.MAC:
flags = ['SIGN_HASH', 'SIGN_MESSAGE',
'VERIFY_HASH', 'VERIFY_MESSAGE']
elif self.category == AlgorithmCategory.CIPHER or \
self.category == AlgorithmCategory.AEAD:
flags = ['DECRYPT', 'ENCRYPT']
elif self.category == AlgorithmCategory.SIGN:
flags = ['VERIFY_HASH', 'VERIFY_MESSAGE']
if not public:
flags += ['SIGN_HASH', 'SIGN_MESSAGE']
elif self.category == AlgorithmCategory.ASYMMETRIC_ENCRYPTION:
flags = ['ENCRYPT']
if not public:
flags += ['DECRYPT']
elif self.category == AlgorithmCategory.KEY_DERIVATION or \
self.category == AlgorithmCategory.KEY_AGREEMENT:
flags = ['DERIVE']
else:
raise AlgorithmNotRecognized(self.expression)
return ['PSA_KEY_USAGE_' + flag for flag in flags]
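The asymmetric branches above encode a simple rule: public keys get only the verifying/encrypting half of the flags. A standalone rendering of just the SIGN branch:

```python
def sign_usage_flags(public: bool) -> list:
    # Public keys can only verify; private keys can also sign,
    # as in the SIGN branch of Algorithm.usage_flags.
    flags = ['VERIFY_HASH', 'VERIFY_MESSAGE']
    if not public:
        flags += ['SIGN_HASH', 'SIGN_MESSAGE']
    return ['PSA_KEY_USAGE_' + flag for flag in flags]

assert sign_usage_flags(public=True) == ['PSA_KEY_USAGE_VERIFY_HASH',
                                         'PSA_KEY_USAGE_VERIFY_MESSAGE']
assert 'PSA_KEY_USAGE_SIGN_MESSAGE' in sign_usage_flags(public=False)
```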



@@ -0,0 +1,875 @@
"""Framework classes for generation of ecp test cases."""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
from typing import List
from . import test_data_generation
from . import bignum_common
class EcpTarget(test_data_generation.BaseTarget):
#pylint: disable=abstract-method, too-few-public-methods
"""Target for ecp test case generation."""
target_basename = 'test_suite_ecp.generated'
class EcpP192R1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P192 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p192_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP192R1_ENABLED",
"MBEDTLS_ECP_NIST_OPTIM"]
moduli = ["fffffffffffffffffffffffffffffffeffffffffffffffff"] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
"fffffffffffffffffffffffffffffffefffffffffffffffe",
# Modulus + 1
"ffffffffffffffffffffffffffffffff0000000000000000",
# 2^192 - 1
"ffffffffffffffffffffffffffffffffffffffffffffffff",
# Maximum canonical P192 multiplication result
("fffffffffffffffffffffffffffffffdfffffffffffffffc"
"000000000000000100000000000000040000000000000004"),
# Generate an overflow during reduction
("00000000000000000000000000000001ffffffffffffffff"
"ffffffffffffffffffffffffffffffff0000000000000000"),
# Generate an overflow during carry reduction
("ffffffffffffffff00000000000000010000000000000000"
"fffffffffffffffeffffffffffffffff0000000000000000"),
        # First 8 numbers generated by random.getrandbits(384) - seed(2,2)
("cf1822ffbc6887782b491044d5e341245c6e433715ba2bdd"
"177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("ffed9235288bc781ae66267594c9c9500925e4749b575bd1"
"3653f8dd9b1f282e4067c3584ee207f8da94e3e8ab73738f"),
("ef8acd128b4f2fc15f3f57ebf30b94fa82523e86feac7eb7"
"dc38f519b91751dacdbd47d364be8049a372db8f6e405d93"),
("e8624fab5186ee32ee8d7ee9770348a05d300cb90706a045"
"defc044a09325626e6b58de744ab6cce80877b6f71e1f6d2"),
("2d3d854e061b90303b08c6e33c7295782d6c797f8f7d9b78"
"2a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"),
("fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f1"
"5c14bc4a829e07b0829a48d422fe99a22c70501e533c9135"),
("97eeab64ca2ce6bc5d3fd983c34c769fe89204e2e8168561"
"867e5e15bc01bfce6a27e0dfcbf8754472154e76e4c11ab2"),
("bd143fa9b714210c665d7435c1066932f4767f26294365b2"
"721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b"),
        # Next 2 numbers generated by random.getrandbits(192)
"47733e847d718d733ff98ff387c56473a7a83ee0761ebfd2",
"cbd4d3e2d4dec9ef83f0be4e80371eb97f81375eecc1cb63"
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
    def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP192R1"] + args
class EcpP224R1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P224 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p224_raw"
input_style = "arch_split"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP224R1_ENABLED",
"MBEDTLS_ECP_NIST_OPTIM"]
moduli = ["ffffffffffffffffffffffffffffffff000000000000000000000001"] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
"ffffffffffffffffffffffffffffffff000000000000000000000000",
# Modulus + 1
"ffffffffffffffffffffffffffffffff000000000000000000000002",
# 2^224 - 1
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
# Maximum canonical P224 multiplication result
("fffffffffffffffffffffffffffffffe000000000000000000000000"
"00000001000000000000000000000000000000000000000000000000"),
# Generate an overflow during reduction
("00000000000000000000000000010000000070000000002000001000"
"ffffffffffff9fffffffffe00000efff000070000000002000001003"),
# Generate an underflow during reduction
("00000001000000000000000000000000000000000000000000000000"
"00000000000dc0000000000000000001000000010000000100000003"),
        # First 8 numbers generated by random.getrandbits(448) - seed(2,2)
("da94e3e8ab73738fcf1822ffbc6887782b491044d5e341245c6e4337"
"15ba2bdd177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("cdbd47d364be8049a372db8f6e405d93ffed9235288bc781ae662675"
"94c9c9500925e4749b575bd13653f8dd9b1f282e4067c3584ee207f8"),
("defc044a09325626e6b58de744ab6cce80877b6f71e1f6d2ef8acd12"
"8b4f2fc15f3f57ebf30b94fa82523e86feac7eb7dc38f519b91751da"),
("2d6c797f8f7d9b782a1be9cd8697bbd0e2520e33e44c50556c71c4a6"
"6148a86fe8624fab5186ee32ee8d7ee9770348a05d300cb90706a045"),
("8f54f8ceacaab39e83844b40ffa9b9f15c14bc4a829e07b0829a48d4"
"22fe99a22c70501e533c91352d3d854e061b90303b08c6e33c729578"),
("97eeab64ca2ce6bc5d3fd983c34c769fe89204e2e8168561867e5e15"
"bc01bfce6a27e0dfcbf8754472154e76e4c11ab2fec3f6b32e8d4b8a"),
("a7a83ee0761ebfd2bd143fa9b714210c665d7435c1066932f4767f26"
"294365b2721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b"),
("74667bffe202849da9643a295a9ac6decbd4d3e2d4dec9ef83f0be4e"
"80371eb97f81375eecc1cb6347733e847d718d733ff98ff387c56473"),
        # Next 2 numbers generated by random.getrandbits(224)
"eb9ac688b9d39cca91551e8259cc60b17604e4b4e73695c3e652c71a",
"f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f258ebdbfe3"
]
@property
def arg_a(self) -> str:
limbs = 2 * bignum_common.bits_to_limbs(224, self.bits_in_limb)
hex_digits = bignum_common.hex_digits_for_limb(limbs, self.bits_in_limb)
return super().format_arg('{:x}'.format(self.int_a)).zfill(hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
    def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP224R1"] + args
class EcpP256R1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P256 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p256_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP256R1_ENABLED",
"MBEDTLS_ECP_NIST_OPTIM"]
moduli = ["ffffffff00000001000000000000000000000000ffffffffffffffffffffffff"] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
"ffffffff00000001000000000000000000000000fffffffffffffffffffffffe",
# Modulus + 1
"ffffffff00000001000000000000000000000001000000000000000000000000",
# 2^256 - 1
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
# Maximum canonical P256 multiplication result
("fffffffe00000002fffffffe0000000100000001fffffffe00000001fffffffc"
"00000003fffffffcfffffffffffffffffffffffc000000000000000000000004"),
# Generate an overflow during reduction
("0000000000000000000000010000000000000000000000000000000000000000"
"00000000000000000000000000000000000000000000000000000000ffffffff"),
# Generate an underflow during reduction
("0000000000000000000000000000000000000000000000000000000000000010"
"ffffffff00000000000000000000000000000000000000000000000000000000"),
# Generate an overflow during carry reduction
("aaaaaaaa00000000000000000000000000000000000000000000000000000000"
"00000000000000000000000000000000aaaaaaacaaaaaaaaaaaaaaaa00000000"),
# Generate an underflow during carry reduction
("000000000000000000000001ffffffff00000000000000000000000000000000"
"0000000000000000000000000000000000000002000000020000000100000002"),
        # First 8 numbers generated by random.getrandbits(512) - seed(2,2)
("4067c3584ee207f8da94e3e8ab73738fcf1822ffbc6887782b491044d5e34124"
"5c6e433715ba2bdd177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("82523e86feac7eb7dc38f519b91751dacdbd47d364be8049a372db8f6e405d93"
"ffed9235288bc781ae66267594c9c9500925e4749b575bd13653f8dd9b1f282e"),
("e8624fab5186ee32ee8d7ee9770348a05d300cb90706a045defc044a09325626"
"e6b58de744ab6cce80877b6f71e1f6d2ef8acd128b4f2fc15f3f57ebf30b94fa"),
("829a48d422fe99a22c70501e533c91352d3d854e061b90303b08c6e33c729578"
"2d6c797f8f7d9b782a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"),
("e89204e2e8168561867e5e15bc01bfce6a27e0dfcbf8754472154e76e4c11ab2"
"fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f15c14bc4a829e07b0"),
("bd143fa9b714210c665d7435c1066932f4767f26294365b2721dea3bf63f23d0"
"dbe53fcafb2147df5ca495fa5a91c89b97eeab64ca2ce6bc5d3fd983c34c769f"),
("74667bffe202849da9643a295a9ac6decbd4d3e2d4dec9ef83f0be4e80371eb9"
"7f81375eecc1cb6347733e847d718d733ff98ff387c56473a7a83ee0761ebfd2"),
("d08f1bb2531d6460f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f25"
"8ebdbfe3eb9ac688b9d39cca91551e8259cc60b17604e4b4e73695c3e652c71a"),
# Next 2 numbers generated by random.getrandbits(256)
"c5e2486c44a4a8f69dc8db48e86ec9c6e06f291b2a838af8d5c44a4eb3172062",
"d4c0dca8b4c9e755cc9c3adcf515a8234da4daeb4f3f87777ad1f45ae9500ec9"
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP256R1"] + args
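Each `result()` method above computes the reference value as `int_a % int_n`. A standalone sketch of that reference computation (plain Python, independent of the Mbed TLS test framework; the padding mirrors the `zfill` used in `arg_a`/`format_result`):

```python
# Reference computation behind the result() methods above: the expected
# output of a fast-reduction test is the input reduced modulo the curve
# prime. P256 below is the secp256r1 prime from the moduli list.

P256 = int(
    "ffffffff00000001000000000000000000000000"
    "ffffffffffffffffffffffff", 16
)

def expected_reduction(input_hex: str, modulus: int = P256) -> str:
    """Hex of input mod modulus, zero-padded to the modulus width."""
    width = len("{:x}".format(modulus))
    return "{:x}".format(int(input_hex, 16) % modulus).zfill(width)
```

For example, the "Modulus + 1" input listed above reduces to 1, zero-padded to the 64-hex-digit width of the prime.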
class EcpP384R1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P384 fast reduction."""
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p384_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP384R1_ENABLED",
"MBEDTLS_ECP_NIST_OPTIM"]
moduli = [("ffffffffffffffffffffffffffffffffffffffffffffffff"
"fffffffffffffffeffffffff0000000000000000ffffffff")
] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
("ffffffffffffffffffffffffffffffffffffffffffffffff"
"fffffffffffffffeffffffff0000000000000000fffffffe"),
# Modulus + 1
("ffffffffffffffffffffffffffffffffffffffffffffffff"
"fffffffffffffffeffffffff000000000000000100000000"),
# 2^384 - 1
("ffffffffffffffffffffffffffffffffffffffffffffffff"
"ffffffffffffffffffffffffffffffffffffffffffffffff"),
# Maximum canonical P384 multiplication result
("ffffffffffffffffffffffffffffffffffffffffffffffff"
"fffffffffffffffdfffffffe0000000000000001fffffffc"
"000000000000000000000000000000010000000200000000"
"fffffffe000000020000000400000000fffffffc00000004"),
# Testing with overflow in A(12) + A(21) + A(20);
("497811378624857a2c2af60d70583376545484cfae5c812f"
"e2999fc1abb51d18b559e8ca3b50aaf263fdf8f24bdfb98f"
"ffffffff20e65bf9099e4e73a5e8b517cf4fbeb8fd1750fd"
"ae6d43f2e53f82d5ffffffffffffffffcc6f1e06111c62e0"),
# Testing with underflow in A(13) + A(22) + A(23) - A(12) - A(20);
("dfdd25e96777406b3c04b8c7b406f5fcf287e1e576003a09"
"2852a6fbe517f2712b68abef41dbd35183a0614fb7222606"
"ffffffff84396eee542f18a9189d94396c784059c17a9f18"
"f807214ef32f2f10ffffffff8a77fac20000000000000000"),
# Testing with overflow in A(23) + A(20) + A(19) - A(22);
("783753f8a5afba6c1862eead1deb2fcdd907272be3ffd185"
"42b24a71ee8b26cab0aa33513610ff973042bbe1637cc9fc"
"99ad36c7f703514572cf4f5c3044469a8f5be6312c19e5d3"
"f8fc1ac6ffffffffffffffff8c86252400000000ffffffff"),
# Testing with underflow in A(23) + A(20) + A(19) - A(22);
("65e1d2362fce922663b7fd517586e88842a9b4bd092e93e6"
"251c9c69f278cbf8285d99ae3b53da5ba36e56701e2b17c2"
"25f1239556c5f00117fa140218b46ebd8e34f50d0018701f"
"a8a0a5cc00000000000000004410bcb4ffffffff00000000"),
# Testing the second round of carry reduction
("000000000000000000000000ffffffffffffffffffffffff"
"ffffffffffffffffffffffffffffffff0000000000000000"
"0000000000000000ffffffff000000000000000000000001"
"00000000000000000000000000000000ffffffff00000001"),
# First 8 numbers generated by random.getrandbits(768) with seed(2,2)
("ffed9235288bc781ae66267594c9c9500925e4749b575bd1"
"3653f8dd9b1f282e4067c3584ee207f8da94e3e8ab73738f"
"cf1822ffbc6887782b491044d5e341245c6e433715ba2bdd"
"177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("e8624fab5186ee32ee8d7ee9770348a05d300cb90706a045"
"defc044a09325626e6b58de744ab6cce80877b6f71e1f6d2"
"ef8acd128b4f2fc15f3f57ebf30b94fa82523e86feac7eb7"
"dc38f519b91751dacdbd47d364be8049a372db8f6e405d93"),
("fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f1"
"5c14bc4a829e07b0829a48d422fe99a22c70501e533c9135"
"2d3d854e061b90303b08c6e33c7295782d6c797f8f7d9b78"
"2a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"),
("bd143fa9b714210c665d7435c1066932f4767f26294365b2"
"721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b"
"97eeab64ca2ce6bc5d3fd983c34c769fe89204e2e8168561"
"867e5e15bc01bfce6a27e0dfcbf8754472154e76e4c11ab2"),
("8ebdbfe3eb9ac688b9d39cca91551e8259cc60b17604e4b4"
"e73695c3e652c71a74667bffe202849da9643a295a9ac6de"
"cbd4d3e2d4dec9ef83f0be4e80371eb97f81375eecc1cb63"
"47733e847d718d733ff98ff387c56473a7a83ee0761ebfd2"),
("d4c0dca8b4c9e755cc9c3adcf515a8234da4daeb4f3f8777"
"7ad1f45ae9500ec9c5e2486c44a4a8f69dc8db48e86ec9c6"
"e06f291b2a838af8d5c44a4eb3172062d08f1bb2531d6460"
"f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f25"),
("0227eeb7b9d7d01f5769da05d205bbfcc8c69069134bccd3"
"e1cf4f589f8e4ce0af29d115ef24bd625dd961e6830b54fa"
"7d28f93435339774bb1e386c4fd5079e681b8f5896838b76"
"9da59b74a6c3181c81e220df848b1df78feb994a81167346"),
("d322a7353ead4efe440e2b4fda9c025a22f1a83185b98f5f"
"c11e60de1b343f52ea748db9e020307aaeb6db2c3a038a70"
"9779ac1f45e9dd320c855fdfa7251af0930cdbd30f0ad2a8"
"1b2d19a2beaa14a7ff3fe32a30ffc4eed0a7bd04e85bfcdd"),
# Next 2 numbers generated by random.getrandbits(384)
("5c3747465cc36c270e8a35b10828d569c268a20eb78ac332"
"e5e138e26c4454b90f756132e16dce72f18e859835e1f291"),
("eb2b5693babb7fbb0a76c196067cfdcb11457d9cf45e2fa0"
"1d7f4275153924800600571fac3a5b263fdf57cd2c006497")
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP384R1"] + args
class EcpP521R1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P521 fast reduction."""
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p521_raw"
input_style = "arch_split"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP521R1_ENABLED",
"MBEDTLS_ECP_NIST_OPTIM"]
moduli = [("01ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff")
] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
("01ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffe"),
# Modulus + 1
("020000000000000000000000000000000000000000000000000000000000000000"
"000000000000000000000000000000000000000000000000000000000000000000"),
# Maximum canonical P521 multiplication result
("0003ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"fffff800"
"0000000000000000000000000000000000000000000000000000000000000000"
"0000000000000000000000000000000000000000000000000000000000000004"),
# Test case for overflow during addition
("0001efffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"000001ef"
"0000000000000000000000000000000000000000000000000000000000000000"
"000000000000000000000000000000000000000000000000000000000f000000"),
# First 8 numbers generated by random.getrandbits(1042) with seed(2,2)
("0003cc2e82523e86feac7eb7dc38f519b91751dacdbd47d364be8049a372db8f"
"6e405d93ffed9235288bc781ae66267594c9c9500925e4749b575bd13653f8dd"
"9b1f282e"
"4067c3584ee207f8da94e3e8ab73738fcf1822ffbc6887782b491044d5e34124"
"5c6e433715ba2bdd177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("00017052829e07b0829a48d422fe99a22c70501e533c91352d3d854e061b9030"
"3b08c6e33c7295782d6c797f8f7d9b782a1be9cd8697bbd0e2520e33e44c5055"
"6c71c4a6"
"6148a86fe8624fab5186ee32ee8d7ee9770348a05d300cb90706a045defc044a"
"09325626e6b58de744ab6cce80877b6f71e1f6d2ef8acd128b4f2fc15f3f57eb"),
("00021f15a7a83ee0761ebfd2bd143fa9b714210c665d7435c1066932f4767f26"
"294365b2721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b97eeab64"
"ca2ce6bc"
"5d3fd983c34c769fe89204e2e8168561867e5e15bc01bfce6a27e0dfcbf87544"
"72154e76e4c11ab2fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f1"),
("000381bc2a838af8d5c44a4eb3172062d08f1bb2531d6460f0caeef038c89b38"
"a8acb5137c9260dc74e088a9b9492f258ebdbfe3eb9ac688b9d39cca91551e82"
"59cc60b1"
"7604e4b4e73695c3e652c71a74667bffe202849da9643a295a9ac6decbd4d3e2"
"d4dec9ef83f0be4e80371eb97f81375eecc1cb6347733e847d718d733ff98ff3"),
("00034816c8c69069134bccd3e1cf4f589f8e4ce0af29d115ef24bd625dd961e6"
"830b54fa7d28f93435339774bb1e386c4fd5079e681b8f5896838b769da59b74"
"a6c3181c"
"81e220df848b1df78feb994a81167346d4c0dca8b4c9e755cc9c3adcf515a823"
"4da4daeb4f3f87777ad1f45ae9500ec9c5e2486c44a4a8f69dc8db48e86ec9c6"),
("000397846c4454b90f756132e16dce72f18e859835e1f291d322a7353ead4efe"
"440e2b4fda9c025a22f1a83185b98f5fc11e60de1b343f52ea748db9e020307a"
"aeb6db2c"
"3a038a709779ac1f45e9dd320c855fdfa7251af0930cdbd30f0ad2a81b2d19a2"
"beaa14a7ff3fe32a30ffc4eed0a7bd04e85bfcdd0227eeb7b9d7d01f5769da05"),
("00002c3296e6bc4d62b47204007ee4fab105d83e85e951862f0981aebc1b00d9"
"2838e766ef9b6bf2d037fe2e20b6a8464174e75a5f834da70569c018eb2b5693"
"babb7fbb"
"0a76c196067cfdcb11457d9cf45e2fa01d7f4275153924800600571fac3a5b26"
"3fdf57cd2c0064975c3747465cc36c270e8a35b10828d569c268a20eb78ac332"),
("00009d23b4917fc09f20dbb0dcc93f0e66dfe717c17313394391b6e2e6eacb0f"
"0bb7be72bd6d25009aeb7fa0c4169b148d2f527e72daf0a54ef25c0707e33868"
"7d1f7157"
"5653a45c49390aa51cf5192bbf67da14be11d56ba0b4a2969d8055a9f03f2d71"
"581d8e830112ff0f0948eccaf8877acf26c377c13f719726fd70bddacb4deeec"),
# Next 2 numbers generated by random.getrandbits(521)
("12b84ae65e920a63ac1f2b64df6dff07870c9d531ae72a47403063238da1a1fe"
"3f9d6a179fa50f96cd4aff9261aa92c0e6f17ec940639bc2ccdf572df00790813e3"),
("166049dd332a73fa0b26b75196cf87eb8a09b27ec714307c68c425424a1574f1"
"eedf5b0f16cdfdb839424d201e653f53d6883ca1c107ca6e706649889c0c7f38608")
]
@property
def arg_a(self) -> str:
# Number of limbs: 2 * N
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP521R1"] + args
class EcpP192K1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P192K1 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p192k1_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP192K1_ENABLED"]
moduli = ["fffffffffffffffffffffffffffffffffffffffeffffee37"] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
"fffffffffffffffffffffffffffffffffffffffeffffee36",
# Modulus + 1
"fffffffffffffffffffffffffffffffffffffffeffffee38",
# 2^192 - 1
"ffffffffffffffffffffffffffffffffffffffffffffffff",
# Maximum canonical P192K1 multiplication result
("fffffffffffffffffffffffffffffffffffffffdffffdc6c"
"0000000000000000000000000000000100002394013c7364"),
# Test case for overflow during addition
("00000007ffff71b809e27dd832cfd5e04d9d2dbb9f8da217"
"0000000000000000000000000000000000000000520834f0"),
# First 8 numbers generated by random.getrandbits(384) with seed(2,2)
("cf1822ffbc6887782b491044d5e341245c6e433715ba2bdd"
"177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("ffed9235288bc781ae66267594c9c9500925e4749b575bd1"
"3653f8dd9b1f282e4067c3584ee207f8da94e3e8ab73738f"),
("ef8acd128b4f2fc15f3f57ebf30b94fa82523e86feac7eb7"
"dc38f519b91751dacdbd47d364be8049a372db8f6e405d93"),
("e8624fab5186ee32ee8d7ee9770348a05d300cb90706a045"
"defc044a09325626e6b58de744ab6cce80877b6f71e1f6d2"),
("2d3d854e061b90303b08c6e33c7295782d6c797f8f7d9b78"
"2a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"),
("fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f1"
"5c14bc4a829e07b0829a48d422fe99a22c70501e533c9135"),
("97eeab64ca2ce6bc5d3fd983c34c769fe89204e2e8168561"
"867e5e15bc01bfce6a27e0dfcbf8754472154e76e4c11ab2"),
("bd143fa9b714210c665d7435c1066932f4767f26294365b2"
"721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b"),
# Next 2 numbers generated by random.getrandbits(192)
"47733e847d718d733ff98ff387c56473a7a83ee0761ebfd2",
"cbd4d3e2d4dec9ef83f0be4e80371eb97f81375eecc1cb63"
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP192K1"] + args
class EcpP224K1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P224 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p224k1_raw"
input_style = "arch_split"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP224K1_ENABLED"]
moduli = ["fffffffffffffffffffffffffffffffffffffffffffffffeffffe56d"] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
"fffffffffffffffffffffffffffffffffffffffffffffffeffffe56c",
# Modulus + 1
"fffffffffffffffffffffffffffffffffffffffffffffffeffffe56e",
# 2^224 - 1
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
# Maximum canonical P224K1 multiplication result
("fffffffffffffffffffffffffffffffffffffffffffffffdffffcad8"
"00000000000000000000000000000000000000010000352802c26590"),
# Test case for overflow during addition
("0000007ffff2b68161180fd8cd92e1a109be158a19a99b1809db8032"
"0000000000000000000000000000000000000000000000000bf04f49"),
# First 8 numbers generated by random.getrandbits(448) with seed(2,2)
("da94e3e8ab73738fcf1822ffbc6887782b491044d5e341245c6e4337"
"15ba2bdd177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("cdbd47d364be8049a372db8f6e405d93ffed9235288bc781ae662675"
"94c9c9500925e4749b575bd13653f8dd9b1f282e4067c3584ee207f8"),
("defc044a09325626e6b58de744ab6cce80877b6f71e1f6d2ef8acd12"
"8b4f2fc15f3f57ebf30b94fa82523e86feac7eb7dc38f519b91751da"),
("2d6c797f8f7d9b782a1be9cd8697bbd0e2520e33e44c50556c71c4a6"
"6148a86fe8624fab5186ee32ee8d7ee9770348a05d300cb90706a045"),
("8f54f8ceacaab39e83844b40ffa9b9f15c14bc4a829e07b0829a48d4"
"22fe99a22c70501e533c91352d3d854e061b90303b08c6e33c729578"),
("97eeab64ca2ce6bc5d3fd983c34c769fe89204e2e8168561867e5e15"
"bc01bfce6a27e0dfcbf8754472154e76e4c11ab2fec3f6b32e8d4b8a"),
("a7a83ee0761ebfd2bd143fa9b714210c665d7435c1066932f4767f26"
"294365b2721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b"),
("74667bffe202849da9643a295a9ac6decbd4d3e2d4dec9ef83f0be4e"
"80371eb97f81375eecc1cb6347733e847d718d733ff98ff387c56473"),
# Next 2 numbers generated by random.getrandbits(224)
("eb9ac688b9d39cca91551e8259cc60b17604e4b4e73695c3e652c71a"),
("f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f258ebdbfe3"),
]
@property
def arg_a(self) -> str:
limbs = 2 * bignum_common.bits_to_limbs(224, self.bits_in_limb)
hex_digits = bignum_common.hex_digits_for_limb(limbs, self.bits_in_limb)
return super().format_arg('{:x}'.format(self.int_a)).zfill(hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP224K1"] + args
class EcpP256K1Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P256 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p256k1_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_SECP256K1_ENABLED"]
moduli = ["fffffffffffffffffffffffffffffffffffffffffffffffffffffffefffffc2f"] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
"fffffffffffffffffffffffffffffffffffffffffffffffffffffffefffffc2e",
# Modulus + 1
"fffffffffffffffffffffffffffffffffffffffffffffffffffffffefffffc30",
# 2^256 - 1
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff",
# Maximum canonical P256K1 multiplication result
("fffffffffffffffffffffffffffffffffffffffffffffffffffffffdfffff85c"
"000000000000000000000000000000000000000000000001000007a4000e9844"),
# Test case for overflow during addition
("0000fffffc2f000e90a0c86a0a63234e5ba641f43a7e4aecc4040e67ec850562"
"00000000000000000000000000000000000000000000000000000000585674fd"),
# First 8 numbers generated by random.getrandbits(512) with seed(2,2)
("4067c3584ee207f8da94e3e8ab73738fcf1822ffbc6887782b491044d5e34124"
"5c6e433715ba2bdd177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("82523e86feac7eb7dc38f519b91751dacdbd47d364be8049a372db8f6e405d93"
"ffed9235288bc781ae66267594c9c9500925e4749b575bd13653f8dd9b1f282e"),
("e8624fab5186ee32ee8d7ee9770348a05d300cb90706a045defc044a09325626"
"e6b58de744ab6cce80877b6f71e1f6d2ef8acd128b4f2fc15f3f57ebf30b94fa"),
("829a48d422fe99a22c70501e533c91352d3d854e061b90303b08c6e33c729578"
"2d6c797f8f7d9b782a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"),
("e89204e2e8168561867e5e15bc01bfce6a27e0dfcbf8754472154e76e4c11ab2"
"fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f15c14bc4a829e07b0"),
("bd143fa9b714210c665d7435c1066932f4767f26294365b2721dea3bf63f23d0"
"dbe53fcafb2147df5ca495fa5a91c89b97eeab64ca2ce6bc5d3fd983c34c769f"),
("74667bffe202849da9643a295a9ac6decbd4d3e2d4dec9ef83f0be4e80371eb9"
"7f81375eecc1cb6347733e847d718d733ff98ff387c56473a7a83ee0761ebfd2"),
("d08f1bb2531d6460f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f25"
"8ebdbfe3eb9ac688b9d39cca91551e8259cc60b17604e4b4e73695c3e652c71a"),
# Next 2 numbers generated by random.getrandbits(256)
("c5e2486c44a4a8f69dc8db48e86ec9c6e06f291b2a838af8d5c44a4eb3172062"),
("d4c0dca8b4c9e755cc9c3adcf515a8234da4daeb4f3f87777ad1f45ae9500ec9"),
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_SECP256K1"] + args
class EcpP255Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP 25519 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "mbedtls_ecp_mod_p255_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_CURVE25519_ENABLED"]
moduli = [("7fffffffffffffffffffffffffffffffffffffffffffffffff"
"ffffffffffffed")] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
("7fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffec"),
# Modulus + 1
("7fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffee"),
# 2^255 - 1
("7fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff"),
# Maximum canonical P255 multiplication result
("3fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffec"
"0000000000000000000000000000000000000000000000000000000000000190"),
# First 8 numbers generated by random.getrandbits(510) with seed(2,2)
("1019f0d64ee207f8da94e3e8ab73738fcf1822ffbc6887782b491044d5e34124"
"5c6e433715ba2bdd177219d30e7a269fd95bafc8f2a4d27bdcf4bb99f4bea973"),
("20948fa1feac7eb7dc38f519b91751dacdbd47d364be8049a372db8f6e405d93"
"ffed9235288bc781ae66267594c9c9500925e4749b575bd13653f8dd9b1f282e"),
("3a1893ea5186ee32ee8d7ee9770348a05d300cb90706a045defc044a09325626"
"e6b58de744ab6cce80877b6f71e1f6d2ef8acd128b4f2fc15f3f57ebf30b94fa"),
("20a6923522fe99a22c70501e533c91352d3d854e061b90303b08c6e33c729578"
"2d6c797f8f7d9b782a1be9cd8697bbd0e2520e33e44c50556c71c4a66148a86f"),
("3a248138e8168561867e5e15bc01bfce6a27e0dfcbf8754472154e76e4c11ab2"
"fec3f6b32e8d4b8a8f54f8ceacaab39e83844b40ffa9b9f15c14bc4a829e07b0"),
("2f450feab714210c665d7435c1066932f4767f26294365b2721dea3bf63f23d0"
"dbe53fcafb2147df5ca495fa5a91c89b97eeab64ca2ce6bc5d3fd983c34c769f"),
("1d199effe202849da9643a295a9ac6decbd4d3e2d4dec9ef83f0be4e80371eb9"
"7f81375eecc1cb6347733e847d718d733ff98ff387c56473a7a83ee0761ebfd2"),
("3423c6ec531d6460f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f25"
"8ebdbfe3eb9ac688b9d39cca91551e8259cc60b17604e4b4e73695c3e652c71a"),
# Next 2 numbers generated by random.getrandbits(255)
("62f1243644a4a8f69dc8db48e86ec9c6e06f291b2a838af8d5c44a4eb3172062"),
("6a606e54b4c9e755cc9c3adcf515a8234da4daeb4f3f87777ad1f45ae9500ec9"),
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_CURVE25519"] + args
class EcpP448Raw(bignum_common.ModOperationCommon,
EcpTarget):
"""Test cases for ECP P448 fast reduction."""
symbol = "-"
test_function = "ecp_mod_p_generic_raw"
test_name = "ecp_mod_p448_raw"
input_style = "fixed"
arity = 1
dependencies = ["MBEDTLS_ECP_DP_CURVE448_ENABLED"]
moduli = [("fffffffffffffffffffffffffffffffffffffffffffffffffffffffe"
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffff")] # type: List[str]
input_values = [
"0", "1",
# Modulus - 1
("fffffffffffffffffffffffffffffffffffffffffffffffffffffffe"
"fffffffffffffffffffffffffffffffffffffffffffffffffffffffe"),
# Modulus + 1
("ffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"00000000000000000000000000000000000000000000000000000000"),
# 2^448 - 1
("ffffffffffffffffffffffffffffffffffffffffffffffffffffffff"
"ffffffffffffffffffffffffffffffffffffffffffffffffffffffff"),
# Maximum canonical P448 multiplication result
("fffffffffffffffffffffffffffffffffffffffffffffffffffffffd"
"fffffffffffffffffffffffffffffffffffffffffffffffffffffffd"
"00000000000000000000000000000000000000000000000000000004"
"00000000000000000000000000000000000000000000000000000004"),
# First 8 numbers generated by random.getrandbits(896) with seed(2,2)
("74667bffe202849da9643a295a9ac6decbd4d3e2d4dec9ef83f0be4e"
"80371eb97f81375eecc1cb6347733e847d718d733ff98ff387c56473"
"a7a83ee0761ebfd2bd143fa9b714210c665d7435c1066932f4767f26"
"294365b2721dea3bf63f23d0dbe53fcafb2147df5ca495fa5a91c89b"),
("4da4daeb4f3f87777ad1f45ae9500ec9c5e2486c44a4a8f69dc8db48"
"e86ec9c6e06f291b2a838af8d5c44a4eb3172062d08f1bb2531d6460"
"f0caeef038c89b38a8acb5137c9260dc74e088a9b9492f258ebdbfe3"
"eb9ac688b9d39cca91551e8259cc60b17604e4b4e73695c3e652c71a"),
("bc1b00d92838e766ef9b6bf2d037fe2e20b6a8464174e75a5f834da7"
"0569c018eb2b5693babb7fbb0a76c196067cfdcb11457d9cf45e2fa0"
"1d7f4275153924800600571fac3a5b263fdf57cd2c0064975c374746"
"5cc36c270e8a35b10828d569c268a20eb78ac332e5e138e26c4454b9"),
("8d2f527e72daf0a54ef25c0707e338687d1f71575653a45c49390aa5"
"1cf5192bbf67da14be11d56ba0b4a2969d8055a9f03f2d71581d8e83"
"0112ff0f0948eccaf8877acf26c377c13f719726fd70bddacb4deeec"
"0b0c995e96e6bc4d62b47204007ee4fab105d83e85e951862f0981ae"),
("84ae65e920a63ac1f2b64df6dff07870c9d531ae72a47403063238da"
"1a1fe3f9d6a179fa50f96cd4aff9261aa92c0e6f17ec940639bc2ccd"
"f572df00790813e32748dd1db4917fc09f20dbb0dcc93f0e66dfe717"
"c17313394391b6e2e6eacb0f0bb7be72bd6d25009aeb7fa0c4169b14"),
("2bb3b36f29421c4021b7379f0897246a40c270b00e893302aba9e7b8"
"23fc5ad2f58105748ed5d1b7b310b730049dd332a73fa0b26b75196c"
"f87eb8a09b27ec714307c68c425424a1574f1eedf5b0f16cdfdb8394"
"24d201e653f53d6883ca1c107ca6e706649889c0c7f3860895bfa813"),
("af3f5d7841b1256d5c1dc12fb5a1ae519fb8883accda6559caa538a0"
"9fc9370d3a6b86a7975b54a31497024640332b0612d4050771d7b14e"
"b6c004cc3b8367dc3f2bb31efe9934ad0809eae3ef232a32b5459d83"
"fbc46f1aea990e94821d46063b4dbf2ca294523d74115c86188b1044"),
("7430051376e31f5aab63ad02854efa600641b4fa37a47ce41aeffafc"
"3b45402ac02659fe2e87d4150511baeb198ababb1a16daff3da95cd2"
"167b75dfb948f82a8317cba01c75f67e290535d868a24b7f627f2855"
"09167d4126af8090013c3273c02c6b9586b4625b475b51096c4ad652"),
# Corner case which causes maximum overflow
("f4ae65e920a63ac1f2b64df6dff07870c9d531ae72a47403063238da1"
"a1fe3f9d6a179fa50f96cd4aff9261aa92c0e6f17ec940639bc2ccd0B"
"519A16DF59C53E0D49B209200F878F362ACE518D5B8BFCF9CDC725E5E"
"01C06295E8605AF06932B5006D9E556D3F190E8136BF9C643D332"),
# Next 2 numbers generated by random.getrandbits(448)
("8f54f8ceacaab39e83844b40ffa9b9f15c14bc4a829e07b0829a48d4"
"22fe99a22c70501e533c91352d3d854e061b90303b08c6e33c729578"),
("97eeab64ca2ce6bc5d3fd983c34c769fe89204e2e8168561867e5e15"
"bc01bfce6a27e0dfcbf8754472154e76e4c11ab2fec3f6b32e8d4b8a"),
]
@property
def arg_a(self) -> str:
return super().format_arg('{:x}'.format(self.int_a)).zfill(2 * self.hex_digits)
def result(self) -> List[str]:
result = self.int_a % self.int_n
return [self.format_result(result)]
@property
def is_valid(self) -> bool:
return True
def arguments(self) -> List[str]:
args = super().arguments()
return ["MBEDTLS_ECP_DP_CURVE448"] + args
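The "numbers generated by random.getrandbits" comments above describe how the pseudo-random inputs were produced. A sketch of that generation, kept deterministic through seeding (the hard-coded values in the lists above are taken as given, not re-derived here):

```python
import random
from typing import List

def reproducible_inputs(bits: int, count: int, seed: int = 2) -> List[str]:
    """Deterministically draw `count` random values of at most `bits` bits,
    in the spirit of the seed(2,2) comments above."""
    rng = random.Random()
    rng.seed(seed, version=2)  # version=2 matches the "seed(2,2)" notation
    return ['{:x}'.format(rng.getrandbits(bits)) for _ in range(count)]
```

Using a private `random.Random` instance rather than the module-level functions keeps the sequence independent of any other consumer of the global PRNG state.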


@ -0,0 +1,46 @@
"""Auxiliary functions for the logging module.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import logging
import sys
def configure_logger(
logger: logging.Logger,
log_format="[%(levelname)s]: %(message)s",
split_level=logging.WARNING
) -> None:
"""
Configure the logging.Logger instance so that:
- The format is set to log_format.
  Default: "[%(levelname)s]: %(message)s"
- Records with level >= split_level are printed to stderr.
- Records with level < split_level are printed to stdout.
  Default split_level: logging.WARNING
"""
class MaxLevelFilter(logging.Filter):
# pylint: disable=too-few-public-methods
def __init__(self, max_level, name=''):
super().__init__(name)
self.max_level = max_level
def filter(self, record: logging.LogRecord) -> bool:
return record.levelno <= self.max_level
log_formatter = logging.Formatter(log_format)
# set loglevel >= split_level to be printed to stderr
stderr_hdlr = logging.StreamHandler(sys.stderr)
stderr_hdlr.setLevel(split_level)
stderr_hdlr.setFormatter(log_formatter)
# set loglevel < split_level to be printed to stdout
stdout_hdlr = logging.StreamHandler(sys.stdout)
stdout_hdlr.addFilter(MaxLevelFilter(split_level - 1))
stdout_hdlr.setFormatter(log_formatter)
logger.addHandler(stderr_hdlr)
logger.addHandler(stdout_hdlr)
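configure_logger writes to the real stderr/stdout, which makes the routing hard to see in isolation. A sketch with illustrative names that attaches the same handler arrangement to StringIO streams so the split can be observed directly:

```python
import io
import logging

def add_split_handlers(logger, low_stream, high_stream,
                       split_level=logging.WARNING):
    """Attach the handler arrangement configure_logger() builds, but
    onto arbitrary streams (illustrative helper, not part of the module)."""
    class MaxLevelFilter(logging.Filter):
        # pylint: disable=too-few-public-methods
        def __init__(self, max_level):
            super().__init__()
            self.max_level = max_level
        def filter(self, record):
            return record.levelno <= self.max_level

    formatter = logging.Formatter("[%(levelname)s]: %(message)s")
    high = logging.StreamHandler(high_stream)   # records >= split_level
    high.setLevel(split_level)
    low = logging.StreamHandler(low_stream)     # records < split_level
    low.addFilter(MaxLevelFilter(split_level - 1))
    for handler in (high, low):
        handler.setFormatter(formatter)
        logger.addHandler(handler)

demo = logging.getLogger("split-demo")
demo.setLevel(logging.DEBUG)
demo.propagate = False
low_out, high_out = io.StringIO(), io.StringIO()
add_split_handlers(demo, low_out, high_out)
demo.info("routine")    # goes to low_out
demo.error("problem")   # goes to high_out
```

Note that the low handler relies on the filter alone: it never calls setLevel, so it accepts everything the filter lets through, exactly as in the function above.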


@ -0,0 +1,539 @@
"""Collect macro definitions from header files.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import itertools
import re
from typing import Dict, IO, Iterable, Iterator, List, Optional, Pattern, Set, Tuple, Union
class ReadFileLineException(Exception):
def __init__(self, filename: str, line_number: Union[int, str]) -> None:
message = 'in {} at {}'.format(filename, line_number)
super(ReadFileLineException, self).__init__(message)
self.filename = filename
self.line_number = line_number
class read_file_lines:
# Dear Pylint, conventionally, a context manager class name is lowercase.
# pylint: disable=invalid-name,too-few-public-methods
"""Context manager to read a text file line by line.
```
with read_file_lines(filename) as lines:
for line in lines:
process(line)
```
is equivalent to
```
with open(filename, 'r') as input_file:
for line in input_file:
process(line)
```
except that if process(line) raises an exception, then the read_file_lines
snippet annotates the exception with the file name and line number.
"""
def __init__(self, filename: str, binary: bool = False) -> None:
self.filename = filename
self.file = None #type: Optional[IO[str]]
self.line_number = 'entry' #type: Union[int, str]
self.generator = None #type: Optional[Iterable[Tuple[int, str]]]
self.binary = binary
def __enter__(self) -> 'read_file_lines':
self.file = open(self.filename, 'rb' if self.binary else 'r')
self.generator = enumerate(self.file)
return self
def __iter__(self) -> Iterator[str]:
assert self.generator is not None
for line_number, content in self.generator:
self.line_number = line_number
yield content
self.line_number = 'exit'
def __exit__(self, exc_type, exc_value, exc_traceback) -> None:
if self.file is not None:
self.file.close()
if exc_type is not None:
raise ReadFileLineException(self.filename, self.line_number) \
from exc_value
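The error-annotation pattern used by read_file_lines can be exercised standalone. A minimal sketch with a throwaway file (class and variable names here are illustrative, not part of the module):

```python
import os
import tempfile

class AnnotatedLineError(Exception):
    """Sketch of ReadFileLineException: carries file name and position."""
    def __init__(self, filename, line_number):
        super().__init__('in {} at {}'.format(filename, line_number))
        self.filename = filename
        self.line_number = line_number

class track_lines:
    """Minimal sketch of the read_file_lines pattern above: iterate a
    file's lines and, on error, re-raise annotated with the position."""
    def __init__(self, filename):
        self.filename = filename
        self.line_number = 'entry'
    def __enter__(self):
        self.file = open(self.filename)
        return self
    def __iter__(self):
        for number, line in enumerate(self.file):
            self.line_number = number  # 0-based, as in the original
            yield line
        self.line_number = 'exit'
    def __exit__(self, exc_type, exc_value, exc_traceback):
        self.file.close()
        if exc_type is not None:
            raise AnnotatedLineError(self.filename, self.line_number) \
                from exc_value

# Exercise it on a throwaway two-line file.
fd, path = tempfile.mkstemp(text=True)
with os.fdopen(fd, 'w') as out:
    out.write('ok\nboom\n')
try:
    with track_lines(path) as lines:
        for line in lines:
            if line.strip() == 'boom':
                raise ValueError(line)
except AnnotatedLineError as err:
    caught = err
os.remove(path)
```

The `__exit__` raise replaces the original exception with the annotated one while chaining it via `from exc_value`, so the traceback still shows where the processing actually failed.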
class PSAMacroEnumerator:
"""Information about constructors of various PSA Crypto types.
This includes macro names as well as information about their arguments
when applicable.
This class only provides ways to enumerate expressions that evaluate to
values of the covered types. Derived classes are expected to populate
the set of known constructors of each kind, as well as populate
`self.arguments_for` for arguments that are not of a kind that is
enumerated here.
"""
#pylint: disable=too-many-instance-attributes
def __init__(self) -> None:
"""Set up an empty set of known constructor macros.
"""
self.statuses = set() #type: Set[str]
self.lifetimes = set() #type: Set[str]
self.locations = set() #type: Set[str]
self.persistence_levels = set() #type: Set[str]
self.algorithms = set() #type: Set[str]
self.ecc_curves = set() #type: Set[str]
self.dh_groups = set() #type: Set[str]
self.key_types = set() #type: Set[str]
self.key_usage_flags = set() #type: Set[str]
self.hash_algorithms = set() #type: Set[str]
self.mac_algorithms = set() #type: Set[str]
self.ka_algorithms = set() #type: Set[str]
self.kdf_algorithms = set() #type: Set[str]
self.pake_algorithms = set() #type: Set[str]
self.aead_algorithms = set() #type: Set[str]
self.sign_algorithms = set() #type: Set[str]
# macro name -> list of argument names
self.argspecs = {} #type: Dict[str, List[str]]
# argument name -> list of values
self.arguments_for = {
'mac_length': [],
'min_mac_length': [],
'tag_length': [],
'min_tag_length': [],
} #type: Dict[str, List[str]]
# Whether to include intermediate macros in enumerations. Intermediate
# macros serve as category headers and are not valid values of their
# type. See `is_internal_name`.
# Always false in this class, may be set to true in derived classes.
self.include_intermediate = False
def is_internal_name(self, name: str) -> bool:
"""Whether this is an internal macro. Internal macros will be skipped."""
if not self.include_intermediate:
if name.endswith('_BASE') or name.endswith('_NONE'):
return True
if '_CATEGORY_' in name:
return True
return name.endswith('_FLAG') or name.endswith('_MASK')
def gather_arguments(self) -> None:
"""Populate the list of values for macro arguments.
Call this after parsing all the inputs.
"""
self.arguments_for['hash_alg'] = sorted(self.hash_algorithms)
self.arguments_for['mac_alg'] = sorted(self.mac_algorithms)
self.arguments_for['ka_alg'] = sorted(self.ka_algorithms)
self.arguments_for['kdf_alg'] = sorted(self.kdf_algorithms)
self.arguments_for['aead_alg'] = sorted(self.aead_algorithms)
self.arguments_for['sign_alg'] = sorted(self.sign_algorithms)
self.arguments_for['curve'] = sorted(self.ecc_curves)
self.arguments_for['group'] = sorted(self.dh_groups)
self.arguments_for['persistence'] = sorted(self.persistence_levels)
self.arguments_for['location'] = sorted(self.locations)
self.arguments_for['lifetime'] = sorted(self.lifetimes)
@staticmethod
def _format_arguments(name: str, arguments: Iterable[str]) -> str:
"""Format a macro call with arguments.
The resulting format is consistent with
`InputsForTest.normalize_argument`.
"""
return name + '(' + ', '.join(arguments) + ')'
_argument_split_re = re.compile(r' *, *')
@classmethod
def _argument_split(cls, arguments: str) -> List[str]:
return re.split(cls._argument_split_re, arguments)
def distribute_arguments(self, name: str) -> Iterator[str]:
"""Generate macro calls with each tested argument set.
If name is a macro without arguments, just yield "name".
If name is a macro with arguments, yield a series of
"name(arg1,...,argN)" where each argument takes each possible
value at least once.
"""
try:
if name not in self.argspecs:
yield name
return
argspec = self.argspecs[name]
if argspec == []:
yield name + '()'
return
argument_lists = [self.arguments_for[arg] for arg in argspec]
arguments = [values[0] for values in argument_lists]
yield self._format_arguments(name, arguments)
# Dear Pylint, enumerate won't work here since we're modifying
# the array.
# pylint: disable=consider-using-enumerate
for i in range(len(arguments)):
for value in argument_lists[i][1:]:
arguments[i] = value
yield self._format_arguments(name, arguments)
arguments[i] = argument_lists[i][0]
except BaseException as e:
raise Exception('distribute_arguments({})'.format(name)) from e
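The enumeration strategy above (vary one argument at a time, keeping the others at their first value) can be exercised with a standalone sketch; the macro name and argument values here are illustrative inputs, not data from the module:

```python
from typing import Dict, Iterator, List

def distribute_arguments(name: str,
                         argspecs: Dict[str, List[str]],
                         arguments_for: Dict[str, List[str]]) -> Iterator[str]:
    """Standalone sketch of PSAMacroEnumerator.distribute_arguments."""
    if name not in argspecs:
        yield name          # macro without arguments
        return
    argspec = argspecs[name]
    if argspec == []:
        yield name + '()'   # function-like macro with an empty argument list
        return
    argument_lists = [arguments_for[arg] for arg in argspec]
    # Start from the first value of every argument, then vary one at a time.
    arguments = [values[0] for values in argument_lists]
    yield name + '(' + ', '.join(arguments) + ')'
    for i in range(len(arguments)):
        for value in argument_lists[i][1:]:
            arguments[i] = value
            yield name + '(' + ', '.join(arguments) + ')'
        arguments[i] = argument_lists[i][0]

calls = list(distribute_arguments(
    'PSA_ALG_HMAC',
    {'PSA_ALG_HMAC': ['hash_alg']},
    {'hash_alg': ['PSA_ALG_SHA_256', 'PSA_ALG_SHA_512']}))
# calls == ['PSA_ALG_HMAC(PSA_ALG_SHA_256)', 'PSA_ALG_HMAC(PSA_ALG_SHA_512)']
```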
def distribute_arguments_without_duplicates(
self, seen: Set[str], name: str
) -> Iterator[str]:
"""Same as `distribute_arguments`, but don't repeat seen results."""
for result in self.distribute_arguments(name):
if result not in seen:
seen.add(result)
yield result
def generate_expressions(self, names: Iterable[str]) -> Iterator[str]:
"""Generate expressions covering values constructed from the given names.
`names` can be any iterable collection of macro names.
For example:
* ``generate_expressions(['PSA_ALG_CMAC', 'PSA_ALG_HMAC'])``
generates ``'PSA_ALG_CMAC'`` as well as ``'PSA_ALG_HMAC(h)'`` for
every known hash algorithm ``h``.
* ``macros.generate_expressions(macros.key_types)`` generates all
key types.
"""
seen = set() #type: Set[str]
return itertools.chain(*(
self.distribute_arguments_without_duplicates(seen, name)
for name in names
))
class PSAMacroCollector(PSAMacroEnumerator):
"""Collect PSA crypto macro definitions from C header files.
"""
def __init__(self, include_intermediate: bool = False) -> None:
"""Set up an object to collect PSA macro definitions.
Call the read_file method of the constructed object on each header file.
* include_intermediate: if true, include intermediate macros such as
PSA_XXX_BASE that do not designate semantic values.
"""
super().__init__()
self.include_intermediate = include_intermediate
self.key_types_from_curve = {} #type: Dict[str, str]
self.key_types_from_group = {} #type: Dict[str, str]
self.algorithms_from_hash = {} #type: Dict[str, str]
@staticmethod
def algorithm_tester(name: str) -> str:
"""The predicate for whether an algorithm is built from the given constructor.
The given name must be the name of an algorithm constructor of the
form ``PSA_ALG_xxx`` which is used as ``PSA_ALG_xxx(yyy)`` to build
an algorithm value. Return the corresponding predicate macro which
is used as ``predicate(alg)`` to test whether ``alg`` can be built
as ``PSA_ALG_xxx(yyy)``. The predicate is usually called
``PSA_ALG_IS_xxx``.
"""
prefix = 'PSA_ALG_'
assert name.startswith(prefix)
midfix = 'IS_'
suffix = name[len(prefix):]
if suffix in ['DSA', 'ECDSA']:
midfix += 'RANDOMIZED_'
elif suffix == 'RSA_PSS':
suffix += '_STANDARD_SALT'
return prefix + midfix + suffix
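For concreteness, here is how the constructor-to-predicate mapping behaves, shown with a standalone copy of the static method above:

```python
def algorithm_tester(name: str) -> str:
    # Standalone copy of PSAMacroCollector.algorithm_tester above.
    prefix = 'PSA_ALG_'
    assert name.startswith(prefix)
    midfix = 'IS_'
    suffix = name[len(prefix):]
    if suffix in ['DSA', 'ECDSA']:
        midfix += 'RANDOMIZED_'
    elif suffix == 'RSA_PSS':
        suffix += '_STANDARD_SALT'
    return prefix + midfix + suffix

# algorithm_tester('PSA_ALG_HMAC') == 'PSA_ALG_IS_HMAC'
# algorithm_tester('PSA_ALG_ECDSA') == 'PSA_ALG_IS_RANDOMIZED_ECDSA'
# algorithm_tester('PSA_ALG_RSA_PSS') == 'PSA_ALG_IS_RSA_PSS_STANDARD_SALT'
```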
def record_algorithm_subtype(self, name: str, expansion: str) -> None:
"""Record the subtype of an algorithm constructor.
Given a ``PSA_ALG_xxx`` macro name and its expansion, if the algorithm
is of a subtype that is tracked in its own set, add it to the relevant
set.
"""
# This code is very ad hoc and fragile. It should be replaced by
# something more robust.
if re.match(r'MAC(?:_|\Z)', name):
self.mac_algorithms.add(name)
elif re.match(r'KDF(?:_|\Z)', name):
self.kdf_algorithms.add(name)
elif re.search(r'0x020000[0-9A-Fa-f]{2}', expansion):
self.hash_algorithms.add(name)
elif re.search(r'0x03[0-9A-Fa-f]{6}', expansion):
self.mac_algorithms.add(name)
elif re.search(r'0x05[0-9A-Fa-f]{6}', expansion):
self.aead_algorithms.add(name)
elif re.search(r'0x09[0-9A-Fa-f]{2}0000', expansion):
self.ka_algorithms.add(name)
elif re.search(r'0x08[0-9A-Fa-f]{6}', expansion):
self.kdf_algorithms.add(name)
# "#define" followed by a macro name with either no parameters
# or a single parameter and a non-empty expansion.
# Grab the macro name in group 1, the parameter name if any in group 2
# and the expansion in group 3.
_define_directive_re = re.compile(r'\s*#\s*define\s+(\w+)' +
r'(?:\s+|\((\w+)\)\s*)' +
r'(.+)')
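A quick standalone demonstration of what the directive regex captures; the expansion text in the sample lines is illustrative, not quoted from a header:

```python
import re

# Same pattern as _define_directive_re above.
define_re = re.compile(r'\s*#\s*define\s+(\w+)'
                       r'(?:\s+|\((\w+)\)\s*)'
                       r'(.+)')

m = define_re.match('#define PSA_ALG_HMAC(hash_alg) (0x03800000 | (hash_alg))')
# m.groups() == ('PSA_ALG_HMAC', 'hash_alg', '(0x03800000 | (hash_alg))')
m2 = define_re.match('#define PSA_SUCCESS ((psa_status_t)0)')
# m2.groups() == ('PSA_SUCCESS', None, '((psa_status_t)0)')
```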
_deprecated_definition_re = re.compile(r'\s*MBEDTLS_DEPRECATED')
def read_line(self, line):
"""Parse a C header line and record the PSA identifier it defines if any.
This function analyzes lines that start with "#define PSA_"
(up to non-significant whitespace) and skips all non-matching lines.
"""
# pylint: disable=too-many-branches
m = re.match(self._define_directive_re, line)
if not m:
return
name, parameter, expansion = m.groups()
expansion = re.sub(r'/\*.*?\*/|//.*', r' ', expansion)
if parameter:
self.argspecs[name] = [parameter]
if re.match(self._deprecated_definition_re, expansion):
# Skip deprecated values, which are assumed to be
# backward compatibility aliases that share
# numerical values with non-deprecated values.
return
if self.is_internal_name(name):
# Macro only to build actual values
return
elif (name.startswith('PSA_ERROR_') or name == 'PSA_SUCCESS') \
and not parameter:
self.statuses.add(name)
elif name.startswith('PSA_KEY_TYPE_') and not parameter:
self.key_types.add(name)
elif name.startswith('PSA_KEY_TYPE_') and parameter == 'curve':
self.key_types_from_curve[name] = name[:13] + 'IS_' + name[13:]
elif name.startswith('PSA_KEY_TYPE_') and parameter == 'group':
self.key_types_from_group[name] = name[:13] + 'IS_' + name[13:]
elif name.startswith('PSA_ECC_FAMILY_') and not parameter:
self.ecc_curves.add(name)
elif name.startswith('PSA_DH_FAMILY_') and not parameter:
self.dh_groups.add(name)
elif name.startswith('PSA_ALG_') and not parameter:
if name in ['PSA_ALG_ECDSA_BASE',
'PSA_ALG_RSA_PKCS1V15_SIGN_BASE']:
# Ad hoc skipping of duplicate names for some numerical values
return
self.algorithms.add(name)
self.record_algorithm_subtype(name, expansion)
elif name.startswith('PSA_ALG_') and parameter == 'hash_alg':
self.algorithms_from_hash[name] = self.algorithm_tester(name)
elif name.startswith('PSA_KEY_USAGE_') and not parameter:
self.key_usage_flags.add(name)
else:
# Other macro without parameter
return
_nonascii_re = re.compile(rb'[^\x00-\x7f]+')
_continued_line_re = re.compile(rb'\\\r?\n\Z')
def read_file(self, header_file):
for line in header_file:
m = re.search(self._continued_line_re, line)
while m:
cont = next(header_file)
line = line[:m.start(0)] + cont
m = re.search(self._continued_line_re, line)
line = re.sub(self._nonascii_re, rb'', line).decode('ascii')
self.read_line(line)
class InputsForTest(PSAMacroEnumerator):
# pylint: disable=too-many-instance-attributes
"""Accumulate information about macros to test.
This includes macro names as well as information about their arguments
when applicable.
"""
def __init__(self) -> None:
super().__init__()
self.all_declared = set() #type: Set[str]
# Identifier prefixes
self.table_by_prefix = {
'ERROR': self.statuses,
'ALG': self.algorithms,
'ECC_CURVE': self.ecc_curves,
'DH_GROUP': self.dh_groups,
'KEY_LIFETIME': self.lifetimes,
'KEY_LOCATION': self.locations,
'KEY_PERSISTENCE': self.persistence_levels,
'KEY_TYPE': self.key_types,
'KEY_USAGE': self.key_usage_flags,
} #type: Dict[str, Set[str]]
# Test functions
self.table_by_test_function = {
# Any function ending in _algorithm also gets added to
# self.algorithms.
'key_type': [self.key_types],
'block_cipher_key_type': [self.key_types],
'stream_cipher_key_type': [self.key_types],
'ecc_key_family': [self.ecc_curves],
'ecc_key_types': [self.ecc_curves],
'dh_key_family': [self.dh_groups],
'dh_key_types': [self.dh_groups],
'hash_algorithm': [self.hash_algorithms],
'mac_algorithm': [self.mac_algorithms],
'cipher_algorithm': [],
'hmac_algorithm': [self.mac_algorithms, self.sign_algorithms],
'aead_algorithm': [self.aead_algorithms],
'key_derivation_algorithm': [self.kdf_algorithms],
'key_agreement_algorithm': [self.ka_algorithms],
'asymmetric_signature_algorithm': [self.sign_algorithms],
'asymmetric_signature_wildcard': [self.algorithms],
'asymmetric_encryption_algorithm': [],
'pake_algorithm': [self.pake_algorithms],
'other_algorithm': [],
'lifetime': [self.lifetimes],
} #type: Dict[str, List[Set[str]]]
mac_lengths = [str(n) for n in [
1, # minimum expressible
4, # minimum allowed by policy
13, # an odd size in a plausible range
14, # an even non-power-of-two size in a plausible range
16, # same as full size for at least one algorithm
63, # maximum expressible
]]
self.arguments_for['mac_length'] += mac_lengths
self.arguments_for['min_mac_length'] += mac_lengths
aead_lengths = [str(n) for n in [
1, # minimum expressible
4, # minimum allowed by policy
13, # an odd size in a plausible range
14, # an even non-power-of-two size in a plausible range
16, # same as full size for at least one algorithm
63, # maximum expressible
]]
self.arguments_for['tag_length'] += aead_lengths
self.arguments_for['min_tag_length'] += aead_lengths
def add_numerical_values(self) -> None:
"""Add numerical values that are not supported to the known identifiers."""
# Sets of names per type
self.algorithms.add('0xffffffff')
self.ecc_curves.add('0xff')
self.dh_groups.add('0xff')
self.key_types.add('0xffff')
self.key_usage_flags.add('0x80000000')
# Hard-coded values for unknown algorithms
#
# These have to have values that are correct for their respective
# PSA_ALG_IS_xxx macros, but are also not currently assigned and are
# not likely to be assigned in the near future.
self.hash_algorithms.add('0x020000fe') # 0x020000ff is PSA_ALG_ANY_HASH
self.mac_algorithms.add('0x03007fff')
self.ka_algorithms.add('0x09fc0000')
self.kdf_algorithms.add('0x080000ff')
self.pake_algorithms.add('0x0a0000ff')
# For AEAD algorithms, the only variability is over the tag length,
# and this only applies to known algorithms, so don't test an
# unknown algorithm.
def get_names(self, type_word: str) -> Set[str]:
"""Return the set of known names of values of the given type."""
return {
'status': self.statuses,
'algorithm': self.algorithms,
'ecc_curve': self.ecc_curves,
'dh_group': self.dh_groups,
'key_type': self.key_types,
'key_usage': self.key_usage_flags,
}[type_word]
# Regex for interesting header lines.
# Groups: 1=macro name, 2=type, 3=argument list (optional).
_header_line_re = \
re.compile(r'#define +' +
r'(PSA_((?:(?:DH|ECC|KEY)_)?[A-Z]+)_\w+)' +
r'(?:\(([^\n()]*)\))?')
# Regex of macro names to exclude.
_excluded_name_re = re.compile(r'_(?:GET|IS|OF)_|_(?:BASE|FLAG|MASK)\Z')
# Additional excluded macros.
_excluded_names = set([
# Macros that provide an alternative way to build the same
# algorithm as another macro.
'PSA_ALG_AEAD_WITH_DEFAULT_LENGTH_TAG',
'PSA_ALG_FULL_LENGTH_MAC',
# Auxiliary macro whose name doesn't fit the usual patterns for
# auxiliary macros.
'PSA_ALG_AEAD_WITH_DEFAULT_LENGTH_TAG_CASE',
])
def parse_header_line(self, line: str) -> None:
"""Parse a C header line, looking for "#define PSA_xxx"."""
m = re.match(self._header_line_re, line)
if not m:
return
name = m.group(1)
self.all_declared.add(name)
if re.search(self._excluded_name_re, name) or \
name in self._excluded_names or \
self.is_internal_name(name):
return
dest = self.table_by_prefix.get(m.group(2))
if dest is None:
return
dest.add(name)
if m.group(3):
self.argspecs[name] = self._argument_split(m.group(3))
_nonascii_re = re.compile(rb'[^\x00-\x7f]+') #type: Pattern
def parse_header(self, filename: str) -> None:
"""Parse a C header file, looking for "#define PSA_xxx"."""
with read_file_lines(filename, binary=True) as lines:
for line in lines:
line = re.sub(self._nonascii_re, rb'', line).decode('ascii')
self.parse_header_line(line)
_macro_identifier_re = re.compile(r'[A-Z]\w+')
def generate_undeclared_names(self, expr: str) -> Iterable[str]:
for name in re.findall(self._macro_identifier_re, expr):
if name not in self.all_declared:
yield name
def accept_test_case_line(self, function: str, argument: str) -> bool:
#pylint: disable=unused-argument
undeclared = list(self.generate_undeclared_names(argument))
if undeclared:
raise Exception('Undeclared names in test case', undeclared)
return True
@staticmethod
def normalize_argument(argument: str) -> str:
"""Normalize whitespace in the given C expression.
The result uses the same whitespace as
`PSAMacroEnumerator.distribute_arguments`.
"""
return re.sub(r',', r', ', re.sub(r' +', r'', argument))
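The two-step rewrite above can be seen on a small illustrative expression (`F` is a hypothetical macro name):

```python
import re

def normalize_argument(argument: str) -> str:
    # Same transformation as above: strip all spaces, then put exactly
    # one space after each comma.
    return re.sub(r',', r', ', re.sub(r' +', r'', argument))

# normalize_argument('F( A , B )') == 'F(A, B)'
```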
def add_test_case_line(self, function: str, argument: str) -> None:
"""Parse a test case data line, looking for algorithm metadata tests."""
sets = []
if function.endswith('_algorithm'):
sets.append(self.algorithms)
if function == 'key_agreement_algorithm' and \
argument.startswith('PSA_ALG_KEY_AGREEMENT('):
# We only want *raw* key agreement algorithms as such, so
# exclude ones that are already chained with a KDF.
# Keep the expression as one to test as an algorithm.
function = 'other_algorithm'
sets += self.table_by_test_function[function]
if self.accept_test_case_line(function, argument):
for s in sets:
s.add(self.normalize_argument(argument))
# Regex matching a *.data line containing a test function call and
# its arguments. The actual definition is partly positional, but this
# regex is good enough in practice.
_test_case_line_re = re.compile(r'(?!depends_on:)(\w+):([^\n :][^:\n]*)')
def parse_test_cases(self, filename: str) -> None:
"""Parse a test case file (*.data), looking for algorithm metadata tests."""
with read_file_lines(filename) as lines:
for line in lines:
m = re.match(self._test_case_line_re, line)
if m:
self.add_test_case_line(m.group(1), m.group(2))


@ -0,0 +1,161 @@
"""Collect information about PSA cryptographic mechanisms.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import re
from collections import OrderedDict
from typing import FrozenSet, List, Optional
from . import macro_collector
class Information:
"""Gather information about PSA constructors."""
def __init__(self) -> None:
self.constructors = self.read_psa_interface()
@staticmethod
def remove_unwanted_macros(
constructors: macro_collector.PSAMacroEnumerator
) -> None:
# Mbed TLS does not support finite-field DSA.
# Don't attempt to generate any related test case.
constructors.key_types.discard('PSA_KEY_TYPE_DSA_KEY_PAIR')
constructors.key_types.discard('PSA_KEY_TYPE_DSA_PUBLIC_KEY')
def read_psa_interface(self) -> macro_collector.PSAMacroEnumerator:
"""Return the list of known key types, algorithms, etc."""
constructors = macro_collector.InputsForTest()
header_file_names = ['include/psa/crypto_values.h',
'include/psa/crypto_extra.h']
test_suites = ['tests/suites/test_suite_psa_crypto_metadata.data']
for header_file_name in header_file_names:
constructors.parse_header(header_file_name)
for test_cases in test_suites:
constructors.parse_test_cases(test_cases)
self.remove_unwanted_macros(constructors)
constructors.gather_arguments()
return constructors
def psa_want_symbol(name: str) -> str:
"""Return the PSA_WANT_xxx symbol associated with a PSA crypto feature."""
if name.startswith('PSA_'):
return name[:4] + 'WANT_' + name[4:]
else:
raise ValueError('Unable to determine the PSA_WANT_ symbol for ' + name)
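A standalone copy showing the symbol transformation, with illustrative inputs:

```python
def psa_want_symbol(name: str) -> str:
    # Same transformation as above: insert 'WANT_' after the 'PSA_' prefix.
    if name.startswith('PSA_'):
        return name[:4] + 'WANT_' + name[4:]
    raise ValueError('Unable to determine the PSA_WANT_ symbol for ' + name)

# psa_want_symbol('PSA_ALG_SHA_256') == 'PSA_WANT_ALG_SHA_256'
# psa_want_symbol('PSA_KEY_TYPE_AES') == 'PSA_WANT_KEY_TYPE_AES'
```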
def finish_family_dependency(dep: str, bits: int) -> str:
"""Finish dep if it's a family dependency symbol prefix.
A family dependency symbol prefix is a PSA_WANT_ symbol that needs to be
qualified by the key size. If dep is such a symbol, finish it by adjusting
the prefix and appending the key size. Other symbols are left unchanged.
"""
return re.sub(r'_FAMILY_(.*)', r'_\1_' + str(bits), dep)
def finish_family_dependencies(dependencies: List[str], bits: int) -> List[str]:
"""Finish any family dependency symbol prefixes.
Apply `finish_family_dependency` to each element of `dependencies`.
"""
return [finish_family_dependency(dep, bits) for dep in dependencies]
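The family-qualification substitution in action, as a standalone sketch:

```python
import re

def finish_family_dependency(dep: str, bits: int) -> str:
    # Same substitution as above: replace the '_FAMILY_' marker and
    # append the key size.
    return re.sub(r'_FAMILY_(.*)', r'_\1_' + str(bits), dep)

# A family prefix gets qualified by the key size:
# finish_family_dependency('PSA_WANT_ECC_FAMILY_SECP_R1', 256)
#     == 'PSA_WANT_ECC_SECP_R1_256'
# Symbols without '_FAMILY_' pass through unchanged.
```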
SYMBOLS_WITHOUT_DEPENDENCY = frozenset([
'PSA_ALG_AEAD_WITH_AT_LEAST_THIS_LENGTH_TAG', # modifier, only in policies
'PSA_ALG_AEAD_WITH_SHORTENED_TAG', # modifier
'PSA_ALG_ANY_HASH', # only in policies
'PSA_ALG_AT_LEAST_THIS_LENGTH_MAC', # modifier, only in policies
'PSA_ALG_KEY_AGREEMENT', # chaining
'PSA_ALG_TRUNCATED_MAC', # modifier
])
def automatic_dependencies(*expressions: str) -> List[str]:
"""Infer dependencies of a test case by looking for PSA_xxx symbols.
The arguments are strings which should be C expressions. Do not use
string literals or comments as this function is not smart enough to
skip them.
"""
used = set()
for expr in expressions:
used.update(re.findall(r'PSA_(?:ALG|ECC_FAMILY|DH_FAMILY|KEY_TYPE)_\w+', expr))
used.difference_update(SYMBOLS_WITHOUT_DEPENDENCY)
return sorted(psa_want_symbol(name) for name in used)
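The inference step can be tried standalone; this sketch keeps only one entry in the exclusion set for brevity and inlines the `psa_want_symbol` rewrite:

```python
import re

# Trimmed-down copy of SYMBOLS_WITHOUT_DEPENDENCY above.
SYMBOLS_WITHOUT_DEPENDENCY = frozenset(['PSA_ALG_ANY_HASH'])

def automatic_dependencies(*expressions: str):
    used = set()
    for expr in expressions:
        used.update(re.findall(r'PSA_(?:ALG|ECC_FAMILY|DH_FAMILY|KEY_TYPE)_\w+',
                               expr))
    used.difference_update(SYMBOLS_WITHOUT_DEPENDENCY)
    return sorted('PSA_WANT_' + name[4:] for name in used)

deps = automatic_dependencies('PSA_KEY_TYPE_AES', 'PSA_ALG_CBC_NO_PADDING')
# deps == ['PSA_WANT_ALG_CBC_NO_PADDING', 'PSA_WANT_KEY_TYPE_AES']
```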
# Define a set of regular expressions and dependencies used to optionally
# append extra dependencies to a test case based on its description.
# Skip AES test cases which require a 192- or 256-bit key
# if MBEDTLS_AES_ONLY_128_BIT_KEY_LENGTH is defined.
AES_128BIT_ONLY_DEP_REGEX = re.compile(r'AES\s(192|256)')
AES_128BIT_ONLY_DEP = ['!MBEDTLS_AES_ONLY_128_BIT_KEY_LENGTH']
# Skip AES/ARIA/CAMELLIA test cases which require a decrypt operation in
# ECB mode if MBEDTLS_BLOCK_CIPHER_NO_DECRYPT is enabled.
ECB_NO_PADDING_DEP_REGEX = re.compile(r'(AES|ARIA|CAMELLIA).*ECB_NO_PADDING')
ECB_NO_PADDING_DEP = ['!MBEDTLS_BLOCK_CIPHER_NO_DECRYPT']
DEPENDENCY_FROM_DESCRIPTION = OrderedDict()
DEPENDENCY_FROM_DESCRIPTION[AES_128BIT_ONLY_DEP_REGEX] = AES_128BIT_ONLY_DEP
DEPENDENCY_FROM_DESCRIPTION[ECB_NO_PADDING_DEP_REGEX] = ECB_NO_PADDING_DEP
def generate_deps_from_description(
description: str
) -> List[str]:
"""Return additional dependencies based on the test case description
and the regular expressions above.
"""
dep_list = []
for regex, deps in DEPENDENCY_FROM_DESCRIPTION.items():
if re.search(regex, description):
dep_list += deps
return dep_list
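A standalone copy of the table-driven lookup above, exercised on an illustrative test case description:

```python
import re
from collections import OrderedDict

# Standalone copy of the description-based dependency table above.
DEPENDENCY_FROM_DESCRIPTION = OrderedDict([
    (re.compile(r'AES\s(192|256)'),
     ['!MBEDTLS_AES_ONLY_128_BIT_KEY_LENGTH']),
    (re.compile(r'(AES|ARIA|CAMELLIA).*ECB_NO_PADDING'),
     ['!MBEDTLS_BLOCK_CIPHER_NO_DECRYPT']),
])

def generate_deps_from_description(description):
    dep_list = []
    for regex, deps in DEPENDENCY_FROM_DESCRIPTION.items():
        if re.search(regex, description):
            dep_list += deps
    return dep_list

# A 256-bit AES case picks up the 128-bit-only exclusion:
# generate_deps_from_description('PSA symmetric encrypt: AES 256 CBC')
#     == ['!MBEDTLS_AES_ONLY_128_BIT_KEY_LENGTH']
```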
# A temporary hack: at the time of writing, not all dependency symbols
# are implemented yet. Skip test cases for which the dependency symbols are
# not available. Once all dependency symbols are available, this hack must
# be removed so that a bug in the dependency symbols properly leads to a test
# failure.
def read_implemented_dependencies(filename: str) -> FrozenSet[str]:
return frozenset(symbol
for line in open(filename)
for symbol in re.findall(r'\bPSA_WANT_\w+\b', line))
_implemented_dependencies = None #type: Optional[FrozenSet[str]] #pylint: disable=invalid-name
def hack_dependencies_not_implemented(dependencies: List[str]) -> None:
global _implemented_dependencies #pylint: disable=global-statement,invalid-name
if _implemented_dependencies is None:
_implemented_dependencies = \
read_implemented_dependencies('include/psa/crypto_config.h')
if not all((dep.lstrip('!') in _implemented_dependencies or
not dep.lstrip('!').startswith('PSA_WANT'))
for dep in dependencies):
dependencies.append('DEPENDENCY_NOT_IMPLEMENTED_YET')
def tweak_key_pair_dependency(dep: str, usage: str):
"""
This helper function adds the proper suffix to PSA_WANT_KEY_TYPE_xxx_KEY_PAIR
symbols according to the required usage.
"""
ret_list = list()
if dep.endswith('KEY_PAIR'):
if usage == "BASIC":
# BASIC automatically includes IMPORT and EXPORT for test purposes (see
# config_psa.h).
ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_BASIC', dep))
ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_IMPORT', dep))
ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_EXPORT', dep))
elif usage == "GENERATE":
ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_GENERATE', dep))
else:
# No replacement to do in this case
ret_list.append(dep)
return ret_list
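The suffix expansion behaves as follows; this is a standalone copy of the helper above with an illustrative dependency:

```python
import re

def tweak_key_pair_dependency(dep: str, usage: str):
    # Standalone copy of the helper above.
    ret_list = list()
    if dep.endswith('KEY_PAIR'):
        if usage == "BASIC":
            # BASIC expands to BASIC + IMPORT + EXPORT.
            ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_BASIC', dep))
            ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_IMPORT', dep))
            ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_EXPORT', dep))
        elif usage == "GENERATE":
            ret_list.append(re.sub(r'KEY_PAIR', r'KEY_PAIR_GENERATE', dep))
    else:
        ret_list.append(dep)
    return ret_list

pairs = tweak_key_pair_dependency('PSA_WANT_KEY_TYPE_RSA_KEY_PAIR', 'BASIC')
# pairs == ['PSA_WANT_KEY_TYPE_RSA_KEY_PAIR_BASIC',
#           'PSA_WANT_KEY_TYPE_RSA_KEY_PAIR_IMPORT',
#           'PSA_WANT_KEY_TYPE_RSA_KEY_PAIR_EXPORT']
```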
def fix_key_pair_dependencies(dep_list: List[str], usage: str):
new_list = [new_deps
for dep in dep_list
for new_deps in tweak_key_pair_dependency(dep, usage)]
return new_list


@ -0,0 +1,206 @@
"""Knowledge about the PSA key store as implemented in Mbed TLS.
Note that if you need to make a change that affects how keys are
stored, this may indicate that the key store is changing in a
backward-incompatible way! Think carefully about backward compatibility
before changing how test data is constructed or validated.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import re
import struct
from typing import Dict, List, Optional, Set, Union
import unittest
from . import c_build_helper
from . import build_tree
class Expr:
"""Representation of a C expression with a known or knowable numerical value."""
def __init__(self, content: Union[int, str]):
if isinstance(content, int):
digits = 8 if content > 0xffff else 4
self.string = '{0:#0{1}x}'.format(content, digits + 2)
self.value_if_known = content #type: Optional[int]
else:
self.string = content
self.unknown_values.add(self.normalize(content))
self.value_if_known = None
value_cache = {} #type: Dict[str, int]
"""Cache of known values of expressions."""
unknown_values = set() #type: Set[str]
"""Expressions whose values are not present in `value_cache` yet."""
def update_cache(self) -> None:
"""Update `value_cache` for expressions registered in `unknown_values`."""
expressions = sorted(self.unknown_values)
includes = ['include']
if build_tree.looks_like_tf_psa_crypto_root('.'):
includes.append('drivers/builtin/include')
values = c_build_helper.get_c_expression_values(
'unsigned long', '%lu',
expressions,
header="""
#include <psa/crypto.h>
""",
include_path=includes) #type: List[str]
for e, v in zip(expressions, values):
self.value_cache[e] = int(v, 0)
self.unknown_values.clear()
@staticmethod
def normalize(string: str) -> str:
"""Put the given C expression in a canonical form.
This function is only intended to give correct results for the
relatively simple kind of C expression typically used with this
module.
"""
return re.sub(r'\s+', r'', string)
def value(self) -> int:
"""Return the numerical value of the expression."""
if self.value_if_known is None:
if re.match(r'([0-9]+|0x[0-9a-f]+)\Z', self.string, re.I):
return int(self.string, 0)
normalized = self.normalize(self.string)
if normalized not in self.value_cache:
self.update_cache()
self.value_if_known = self.value_cache[normalized]
return self.value_if_known
Exprable = Union[str, int, Expr]
"""Something that can be converted to a C expression with a known numerical value."""
def as_expr(thing: Exprable) -> Expr:
"""Return an `Expr` object for `thing`.
If `thing` is already an `Expr` object, return it. Otherwise build a new
`Expr` object from `thing`. `thing` can be an integer or a string that
contains a C expression.
"""
if isinstance(thing, Expr):
return thing
else:
return Expr(thing)
class Key:
"""Representation of a PSA crypto key object and its storage encoding.
"""
LATEST_VERSION = 0
"""The latest version of the storage format."""
def __init__(self, *,
version: Optional[int] = None,
id: Optional[int] = None, #pylint: disable=redefined-builtin
lifetime: Exprable = 'PSA_KEY_LIFETIME_PERSISTENT',
type: Exprable, #pylint: disable=redefined-builtin
bits: int,
usage: Exprable, alg: Exprable, alg2: Exprable,
material: bytes #pylint: disable=used-before-assignment
) -> None:
self.version = self.LATEST_VERSION if version is None else version
self.id = id #pylint: disable=invalid-name #type: Optional[int]
self.lifetime = as_expr(lifetime) #type: Expr
self.type = as_expr(type) #type: Expr
self.bits = bits #type: int
self.usage = as_expr(usage) #type: Expr
self.alg = as_expr(alg) #type: Expr
self.alg2 = as_expr(alg2) #type: Expr
self.material = material #type: bytes
MAGIC = b'PSA\000KEY\000'
@staticmethod
def pack(
fmt: str,
*args: Union[int, Expr]
) -> bytes: #pylint: disable=used-before-assignment
"""Pack the given arguments into a byte string according to the given format.
This function is similar to `struct.pack`, but with the following differences:
* All integer values are encoded with standard sizes and in
little-endian representation. `fmt` must not include an endianness
prefix.
* Arguments can be `Expr` objects instead of integers.
* Only integer-valued elements are supported.
"""
return struct.pack('<' + fmt, # little-endian, standard sizes
*[arg.value() if isinstance(arg, Expr) else arg
for arg in args])
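The effect of the forced `'<'` prefix can be seen in a small standalone example with illustrative values:

```python
import struct

# '<' forces little-endian byte order with standard sizes, as in pack() above:
# 'L' is always 4 bytes and 'H' always 2, regardless of platform.
blob = struct.pack('<LHH', 0x00000001, 0x2400, 128)
# blob.hex() == '0100000000248000'
```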
def bytes(self) -> bytes:
"""Return the representation of the key in storage as a byte array.
This is the content of the PSA storage file. When PSA storage is
implemented over stdio files, this does not include any wrapping made
by the PSA-storage-over-stdio-file implementation.
Note that if you need to make a change in this function,
this may indicate that the key store is changing in a
backward-incompatible way! Think carefully about backward
compatibility before making any change here.
"""
header = self.MAGIC + self.pack('L', self.version)
if self.version == 0:
attributes = self.pack('LHHLLL',
self.lifetime, self.type, self.bits,
self.usage, self.alg, self.alg2)
material = self.pack('L', len(self.material)) + self.material
else:
raise NotImplementedError
return header + attributes + material
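As a cross-check of the version-0 layout, the storage blob for the key used in `TestKey.test_numerical` can be assembled directly with `struct`; the attribute values in this standalone sketch are taken from that test:

```python
import struct

MAGIC = b'PSA\000KEY\000'
header = MAGIC + struct.pack('<L', 0)   # storage format version 0
attributes = struct.pack('<LHHLLL',
                         0x00000001,    # lifetime
                         0x2400,        # type
                         128,           # bits
                         0x00000300,    # usage
                         0x05500200,    # alg
                         0x04c01000)    # alg2
material = b'@ABCDEFGHIJKLMNO'
blob = header + attributes + struct.pack('<L', len(material)) + material
# blob.hex() matches the expected_hex in TestKey.test_numerical.
```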
def hex(self) -> str:
"""Return the representation of the key as a hexadecimal string.
This is the hexadecimal representation of `self.bytes`.
"""
return self.bytes().hex()
def location_value(self) -> int:
"""The numerical value of the location encoded in the key's lifetime."""
return self.lifetime.value() >> 8
class TestKey(unittest.TestCase):
# pylint: disable=line-too-long
"""A few smoke tests for the functionality of the `Key` class."""
def test_numerical(self):
key = Key(version=0,
id=1, lifetime=0x00000001,
type=0x2400, bits=128,
usage=0x00000300, alg=0x05500200, alg2=0x04c01000,
material=b'@ABCDEFGHIJKLMNO')
expected_hex = '505341004b45590000000000010000000024800000030000000250050010c00410000000404142434445464748494a4b4c4d4e4f'
self.assertEqual(key.bytes(), bytes.fromhex(expected_hex))
self.assertEqual(key.hex(), expected_hex)
def test_names(self):
length = 0xfff8 // 8 # PSA_MAX_KEY_BITS in bytes
key = Key(version=0,
id=1, lifetime='PSA_KEY_LIFETIME_PERSISTENT',
type='PSA_KEY_TYPE_RAW_DATA', bits=length*8,
usage=0, alg=0, alg2=0,
material=b'\x00' * length)
expected_hex = '505341004b45590000000000010000000110f8ff000000000000000000000000ff1f0000' + '00' * length
self.assertEqual(key.bytes(), bytes.fromhex(expected_hex))
self.assertEqual(key.hex(), expected_hex)
def test_defaults(self):
key = Key(type=0x1001, bits=8,
usage=0, alg=0, alg2=0,
material=b'\x2a')
expected_hex = '505341004b455900000000000100000001100800000000000000000000000000010000002a'
self.assertEqual(key.bytes(), bytes.fromhex(expected_hex))
self.assertEqual(key.hex(), expected_hex)


@ -0,0 +1,91 @@
"""Library for constructing an Mbed TLS test case.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import binascii
import os
import sys
from typing import Iterable, List, Optional
from . import typing_util
def hex_string(data: bytes) -> str:
return '"' + binascii.hexlify(data).decode('ascii') + '"'
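A standalone copy showing the quoting convention used in .data files:

```python
import binascii

def hex_string(data: bytes) -> str:
    # Same helper as above: hex-encode and wrap in double quotes,
    # the form expected by .data test files.
    return '"' + binascii.hexlify(data).decode('ascii') + '"'

# hex_string(b'\x01\xab') == '"01ab"'
```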
class MissingDescription(Exception):
pass
class MissingFunction(Exception):
pass
class TestCase:
"""An Mbed TLS test case."""
def __init__(self, description: Optional[str] = None):
self.comments = [] #type: List[str]
self.description = description #type: Optional[str]
self.dependencies = [] #type: List[str]
self.function = None #type: Optional[str]
self.arguments = [] #type: List[str]
def add_comment(self, *lines: str) -> None:
self.comments += lines
def set_description(self, description: str) -> None:
self.description = description
def set_dependencies(self, dependencies: List[str]) -> None:
self.dependencies = dependencies
def set_function(self, function: str) -> None:
self.function = function
def set_arguments(self, arguments: List[str]) -> None:
self.arguments = arguments
def check_completeness(self) -> None:
if self.description is None:
raise MissingDescription
if self.function is None:
raise MissingFunction
def write(self, out: typing_util.Writable) -> None:
"""Write the .data file paragraph for this test case.
The output starts and ends with a single newline character. If the
surrounding code writes lines (consisting of non-newline characters
and a final newline), you will end up with a blank line before, but
not after the test case.
"""
self.check_completeness()
assert self.description is not None # guide mypy
assert self.function is not None # guide mypy
out.write('\n')
for line in self.comments:
out.write('# ' + line + '\n')
out.write(self.description + '\n')
if self.dependencies:
out.write('depends_on:' + ':'.join(self.dependencies) + '\n')
out.write(self.function + ':' + ':'.join(self.arguments) + '\n')
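The .data paragraph format produced by `write` can be sketched standalone; the test case shown here (function name and arguments) is hypothetical:

```python
import io

# Minimal standalone sketch of TestCase.write's .data paragraph format.
def write_paragraph(out, description, dependencies, function, arguments,
                    comments=()):
    out.write('\n')
    for line in comments:
        out.write('# ' + line + '\n')
    out.write(description + '\n')
    if dependencies:
        out.write('depends_on:' + ':'.join(dependencies) + '\n')
    out.write(function + ':' + ':'.join(arguments) + '\n')

buf = io.StringIO()
write_paragraph(buf, 'AES-128 smoke test',
                ['PSA_WANT_KEY_TYPE_AES'], 'cipher_test', ['"00"', '"ff"'])
# buf.getvalue() ==
#   '\nAES-128 smoke test\ndepends_on:PSA_WANT_KEY_TYPE_AES\ncipher_test:"00":"ff"\n'
```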
def write_data_file(filename: str,
test_cases: Iterable[TestCase],
caller: Optional[str] = None) -> None:
"""Write the test cases to the specified file.
If the file already exists, it is overwritten.
"""
if caller is None:
caller = os.path.basename(sys.argv[0])
tempfile = filename + '.new'
with open(tempfile, 'w') as out:
out.write('# Automatically generated by {}. Do not edit!\n'
.format(caller))
for tc in test_cases:
tc.write(out)
out.write('\n# End of automatically generated file.\n')
os.replace(tempfile, filename)


@ -0,0 +1,224 @@
"""Common code for test data generation.
This module defines classes that are of general use to automatically
generate .data files for unit tests, as well as a main function.
These are used both by generate_psa_tests.py and generate_bignum_tests.py.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
import argparse
import os
import posixpath
import re
import inspect
from abc import ABCMeta, abstractmethod
from typing import Callable, Dict, Iterable, Iterator, List, Type, TypeVar
from . import build_tree
from . import test_case
T = TypeVar('T') #pylint: disable=invalid-name
class BaseTest(metaclass=ABCMeta):
"""Base class for test case generation.
Attributes:
count: Counter for test cases from this class.
case_description: Short description of the test case. This may be
automatically generated using the class, or manually set.
dependencies: A list of dependencies required for the test case.
show_test_count: Toggle for inclusion of `count` in the test description.
test_function: Test function which the class generates cases for.
test_name: A common name or description of the test function. This can
be `test_function`, a clearer equivalent, or a short summary of the
test function's purpose.
"""
count = 0
case_description = ""
dependencies = [] # type: List[str]
show_test_count = True
test_function = ""
test_name = ""
def __new__(cls, *args, **kwargs):
# pylint: disable=unused-argument
cls.count += 1
return super().__new__(cls)
@abstractmethod
def arguments(self) -> List[str]:
"""Get the list of arguments for the test case.
Override this method to provide the list of arguments required for
the `test_function`.
Returns:
List of arguments required for the test function.
"""
raise NotImplementedError
def description(self) -> str:
"""Create a test case description.
Creates a description of the test case, including a name for the test
function, an optional case count, and a description of the specific
test case. This should inform a reader what is being tested, and
provide context for the test case.
Returns:
Description for the test case.
"""
if self.show_test_count:
return "{} #{} {}".format(
self.test_name, self.count, self.case_description
).strip()
else:
return "{} {}".format(self.test_name, self.case_description).strip()
def create_test_case(self) -> test_case.TestCase:
"""Generate TestCase from the instance."""
tc = test_case.TestCase()
tc.set_description(self.description())
tc.set_function(self.test_function)
tc.set_arguments(self.arguments())
tc.set_dependencies(self.dependencies)
return tc
@classmethod
@abstractmethod
def generate_function_tests(cls) -> Iterator[test_case.TestCase]:
"""Generate test cases for the class test function.
This will be called in classes where `test_function` is set.
Implementations should yield TestCase objects, by creating instances
of the class with appropriate input data, and then calling
`create_test_case()` on each.
"""
raise NotImplementedError
class BaseTarget:
#pylint: disable=too-few-public-methods
"""Base target for test case generation.
Child classes of this class represent an output file, and can be referred
to as file targets. These indicate where test cases will be written to for
all subclasses of the file target, which is set by `target_basename`.
Attributes:
target_basename: Basename of file to write generated tests to. This
should be specified in a child class of BaseTarget.
"""
target_basename = ""
@classmethod
def generate_tests(cls) -> Iterator[test_case.TestCase]:
"""Generate test cases for the class and its subclasses.
In classes with `test_function` set, `generate_function_tests()` is
called to generate test cases first.
In all classes, this method will iterate over its subclasses, and
yield from `generate_tests()` in each. Calling this method on a class X
will yield test cases from all classes derived from X.
"""
if issubclass(cls, BaseTest) and not inspect.isabstract(cls):
#pylint: disable=no-member
yield from cls.generate_function_tests()
for subclass in sorted(cls.__subclasses__(), key=lambda c: c.__name__):
yield from subclass.generate_tests()
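The subclass recursion performed by `BaseTarget.generate_tests()` can be illustrated with a minimal, self-contained sketch. The class names `Target`, `B`, `A` and the `emit` hook below are invented for illustration; the real code instead checks for concrete `BaseTest` subclasses via `issubclass` and `inspect.isabstract`.

```python
# Minimal sketch of the subclass-recursion idea: calling generate() on a
# class X yields results from X itself (if it defines the hook) and then
# from every class derived from X, in sorted subclass-name order.
class Target:
    @classmethod
    def generate(cls):
        if 'emit' in cls.__dict__:          # stand-in for the BaseTest check
            yield from cls.emit()
        for sub in sorted(cls.__subclasses__(), key=lambda c: c.__name__):
            yield from sub.generate()

class B(Target):
    @classmethod
    def emit(cls):
        yield 'from B'

class A(B):
    @classmethod
    def emit(cls):
        yield 'from A'

print(list(Target.generate()))  # ['from B', 'from A']
```

Calling the method on the root class therefore collects cases from the whole hierarchy, so new test families can be added simply by deriving a class.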
class TestGenerator:
"""Generate test cases and write to data files."""
def __init__(self, options) -> None:
self.test_suite_directory = options.directory
# Update `targets` with an entry for each child class of BaseTarget.
# Each entry represents a file generated by the BaseTarget framework,
# and enables generating the .data files using the CLI.
self.targets.update({
subclass.target_basename: subclass.generate_tests
for subclass in BaseTarget.__subclasses__()
if subclass.target_basename
})
def filename_for(self, basename: str) -> str:
"""The location of the data file with the specified base name."""
return posixpath.join(self.test_suite_directory, basename + '.data')
def write_test_data_file(self, basename: str,
test_cases: Iterable[test_case.TestCase]) -> None:
"""Write the test cases to a .data file.
The output file is ``basename + '.data'`` in the test suite directory.
"""
filename = self.filename_for(basename)
test_case.write_data_file(filename, test_cases)
# Note that targets whose names contain 'test_format' have their content
# validated by `abi_check.py`.
targets = {} # type: Dict[str, Callable[..., Iterable[test_case.TestCase]]]
def generate_target(self, name: str, *target_args) -> None:
"""Generate cases and write to data file for a target.
For target callables which require arguments, override this function
and pass these arguments using super() (see PSATestGenerator).
"""
test_cases = self.targets[name](*target_args)
self.write_test_data_file(name, test_cases)
def main(args, description: str, generator_class: Type[TestGenerator] = TestGenerator):
"""Command line entry point."""
parser = argparse.ArgumentParser(description=description)
parser.add_argument('--list', action='store_true',
help='List available targets and exit')
parser.add_argument('--list-for-cmake', action='store_true',
help='Print \';\'-separated list of available targets and exit')
# If specified explicitly, this option may be a path relative to the
# current directory when the script is invoked. The default value
# is relative to the mbedtls root, which we don't know yet. So we
# can't set a string as the default value here.
parser.add_argument('--directory', metavar='DIR',
help='Output directory (default: tests/suites)')
parser.add_argument('targets', nargs='*', metavar='TARGET',
help='Target file to generate (default: all; "-": none)')
options = parser.parse_args(args)
# Change to the mbedtls root, to keep things simple. But first, adjust
# command line options that might be relative paths.
if options.directory is None:
options.directory = 'tests/suites'
else:
options.directory = os.path.abspath(options.directory)
build_tree.chdir_to_root()
generator = generator_class(options)
if options.list:
for name in sorted(generator.targets):
print(generator.filename_for(name))
return
# List in a cmake list format (i.e. ';'-separated)
if options.list_for_cmake:
print(';'.join(generator.filename_for(name)
for name in sorted(generator.targets)), end='')
return
if options.targets:
# Allow "-" as a special case so you can run
# ``generate_xxx_tests.py - $targets`` and it works uniformly whether
# ``$targets`` is empty or not.
options.targets = [os.path.basename(re.sub(r'\.data\Z', r'', target))
for target in options.targets
if target != '-']
else:
options.targets = sorted(generator.targets)
for target in options.targets:
generator.generate_target(target)
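The TARGET normalization in `main()` accepts either a bare target name or a path to the corresponding `.data` file. That step can be checked in isolation (the function name is invented for this sketch):

```python
import os
import re

def normalize_target(arg):
    # Strip an optional trailing '.data' and any directory part, mirroring
    # the handling of TARGET arguments in main() above (a sketch).
    return os.path.basename(re.sub(r'\.data\Z', r'', arg))

print(normalize_target('tests/suites/test_suite_psa.data'))  # test_suite_psa
print(normalize_target('test_suite_psa'))                    # test_suite_psa
```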

"""Auxiliary definitions used in type annotations.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
from typing import Any
# The typing_extensions module is necessary for type annotations that are
# checked with mypy. It is only used for type annotations or to define
# things that are themselves only used for type annotations. It is not
# available on a default Python installation. Therefore, try loading
# what we need from it for the sake of mypy (which depends on, or comes
# with, typing_extensions), and if not define substitutes that lack the
# static type information but are good enough at runtime.
try:
from typing_extensions import Protocol #pylint: disable=import-error
except ImportError:
class Protocol: #type: ignore
#pylint: disable=too-few-public-methods
pass
class Writable(Protocol):
"""Abstract class for typing hints."""
# pylint: disable=no-self-use,too-few-public-methods,unused-argument
def write(self, text: str) -> Any:
...
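Any object with a `write(str)` method satisfies this protocol, so a helper annotated with `Writable` works equally well with `sys.stdout` or an in-memory buffer. A minimal sketch (the `emit` helper is invented):

```python
import io

def emit(out, lines):
    # 'out' only needs a .write() method, which is all Writable promises.
    for line in lines:
        out.write(line + '\n')

buf = io.StringIO()          # an in-memory Writable
emit(buf, ['a', 'b'])
print(repr(buf.getvalue()))  # 'a\nb\n'
```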

externals/mbedtls/scripts/memory.sh vendored Executable file
#!/bin/sh
# Measure memory usage of a minimal client using a small configuration
# Currently hardwired to ccm-psk and suite-b, may be expanded later
#
# Use different build options for measuring executable size and memory usage,
# since for memory we want debug information.
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
set -eu
CONFIG_H='include/mbedtls/mbedtls_config.h'
CLIENT='mini_client'
CFLAGS_EXEC='-fno-asynchronous-unwind-tables -Wl,--gc-sections -ffunction-sections -fdata-sections'
CFLAGS_MEM=-g3
if [ -r $CONFIG_H ]; then :; else
echo "$CONFIG_H not found" >&2
exit 1
fi
if grep -i cmake Makefile >/dev/null; then
echo "Not compatible with CMake" >&2
exit 1
fi
if [ $( uname ) != Linux ]; then
echo "This script only works on Linux" >&2
exit 1
fi
if git status | grep -F $CONFIG_H >/dev/null 2>&1; then
echo "mbedtls_config.h not clean" >&2
exit 1
fi
# make measurements with one configuration
# usage: do_config <name> <unset-list> <server-args>
do_config()
{
NAME=$1
UNSET_LIST=$2
SERVER_ARGS=$3
echo ""
echo "config-$NAME:"
cp configs/config-$NAME.h $CONFIG_H
scripts/config.py unset MBEDTLS_SSL_SRV_C
for FLAG in $UNSET_LIST; do
scripts/config.py unset $FLAG
done
grep -F SSL_MAX_CONTENT_LEN $CONFIG_H || echo 'SSL_MAX_CONTENT_LEN=16384'
printf " Executable size... "
make clean
CFLAGS=$CFLAGS_EXEC make OFLAGS=-Os lib >/dev/null 2>&1
cd programs
CFLAGS=$CFLAGS_EXEC make OFLAGS=-Os ssl/$CLIENT >/dev/null
strip ssl/$CLIENT
stat -c '%s' ssl/$CLIENT
cd ..
printf " Peak ram usage... "
make clean
CFLAGS=$CFLAGS_MEM make OFLAGS=-Os lib >/dev/null 2>&1
cd programs
CFLAGS=$CFLAGS_MEM make OFLAGS=-Os ssl/$CLIENT >/dev/null
cd ..
./ssl_server2 $SERVER_ARGS >/dev/null &
SRV_PID=$!
sleep 1;
if valgrind --tool=massif --stacks=yes programs/ssl/$CLIENT >/dev/null 2>&1
then
FAILED=0
else
echo "client failed" >&2
FAILED=1
fi
kill $SRV_PID
wait $SRV_PID
scripts/massif_max.pl massif.out.*
mv massif.out.* massif-$NAME.$$
}
# preparation
CONFIG_BAK=${CONFIG_H}.bak
cp $CONFIG_H $CONFIG_BAK
rm -f massif.out.*
printf "building server... "
make clean
make lib >/dev/null 2>&1
(cd programs && make ssl/ssl_server2) >/dev/null
cp programs/ssl/ssl_server2 .
echo "done"
# actual measurements
do_config "ccm-psk-tls1_2" \
"" \
"psk=000102030405060708090A0B0C0D0E0F"
do_config "suite-b" \
"MBEDTLS_BASE64_C MBEDTLS_PEM_PARSE_C" \
""
# cleanup
mv $CONFIG_BAK $CONFIG_H
make clean
rm ssl_server2
exit $FAILED

externals/mbedtls/scripts/min_requirements.py vendored Executable file
#!/usr/bin/env python3
"""Install all the required Python packages, with the minimum Python version.
"""
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
import argparse
import os
import re
import subprocess
import sys
import tempfile
import typing
from typing import List, Optional
from mbedtls_dev import typing_util
def pylint_doesn_t_notice_that_certain_types_are_used_in_annotations(
_list: List[typing.Any],
) -> None:
pass
class Requirements:
"""Collect and massage Python requirements."""
def __init__(self) -> None:
self.requirements = [] #type: List[str]
def adjust_requirement(self, req: str) -> str:
"""Adjust a requirement to the minimum specified version."""
# allow inheritance #pylint: disable=no-self-use
# If a requirement specifies a minimum version, impose that version.
split_req = req.split(';', 1)
split_req[0] = re.sub(r'>=|~=', r'==', split_req[0])
return ';'.join(split_req)
def add_file(self, filename: str) -> None:
"""Add requirements from the specified file.
This method supports a subset of pip's requirement file syntax:
* One requirement specifier per line, which is passed to
`adjust_requirement`.
* Comments (``#`` at the beginning of the line or after whitespace).
* ``-r FILENAME`` to include another file.
"""
for line in open(filename):
line = line.strip()
line = re.sub(r'(\A|\s+)#.*', r'', line)
if not line:
continue
m = re.match(r'-r\s+', line)
if m:
nested_file = os.path.join(os.path.dirname(filename),
line[m.end(0):])
self.add_file(nested_file)
continue
self.requirements.append(self.adjust_requirement(line))
def write(self, out: typing_util.Writable) -> None:
"""List the gathered requirements."""
for req in self.requirements:
out.write(req + '\n')
def install(
self,
pip_general_options: Optional[List[str]] = None,
pip_install_options: Optional[List[str]] = None,
) -> None:
"""Call pip to install the requirements."""
if pip_general_options is None:
pip_general_options = []
if pip_install_options is None:
pip_install_options = []
with tempfile.TemporaryDirectory() as temp_dir:
# This is more complicated than it needs to be for the sake
# of Windows. Use a temporary file rather than the command line
# to avoid quoting issues. Use a temporary directory rather
# than NamedTemporaryFile because with a NamedTemporaryFile on
# Windows, the subprocess can't open the file because this process
# has an exclusive lock on it.
req_file_name = os.path.join(temp_dir, 'requirements.txt')
with open(req_file_name, 'w') as req_file:
self.write(req_file)
subprocess.check_call([sys.executable, '-m', 'pip'] +
pip_general_options +
['install'] + pip_install_options +
['-r', req_file_name])
DEFAULT_REQUIREMENTS_FILE = 'ci.requirements.txt'
def main() -> None:
"""Command line entry point."""
parser = argparse.ArgumentParser(description=__doc__)
parser.add_argument('--no-act', '-n',
action='store_true',
help="Don't act, just print what will be done")
parser.add_argument('--pip-install-option',
action='append', dest='pip_install_options',
help="Pass this option to pip install")
parser.add_argument('--pip-option',
action='append', dest='pip_general_options',
help="Pass this general option to pip")
parser.add_argument('--user',
action='append_const', dest='pip_install_options',
const='--user',
help="Install to the Python user install directory"
" (short for --pip-install-option --user)")
parser.add_argument('files', nargs='*', metavar='FILE',
help="Requirement files"
" (default: {} in the script's directory)" \
.format(DEFAULT_REQUIREMENTS_FILE))
options = parser.parse_args()
if not options.files:
options.files = [os.path.join(os.path.dirname(__file__),
DEFAULT_REQUIREMENTS_FILE)]
reqs = Requirements()
for filename in options.files:
reqs.add_file(filename)
reqs.write(sys.stdout)
if not options.no_act:
reqs.install(pip_general_options=options.pip_general_options,
pip_install_options=options.pip_install_options)
if __name__ == '__main__':
main()
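The version-pinning rule in `Requirements.adjust_requirement` is easy to check in isolation: a `>=` or `~=` lower bound becomes an exact `==` pin, while any environment marker after `;` is left untouched (the helper name below is invented):

```python
import re

def pin_minimum(req):
    # Mirror of Requirements.adjust_requirement: turn a '>=' or '~=' lower
    # bound into an exact '==' pin, leaving any marker after ';' alone.
    split_req = req.split(';', 1)
    split_req[0] = re.sub(r'>=|~=', r'==', split_req[0])
    return ';'.join(split_req)

print(pin_minimum('mypy>=0.910'))  # mypy==0.910
print(pin_minimum('types-foo~=1.2; python_version < "3.9"'))
```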

externals/mbedtls/scripts/output_env.sh vendored Executable file
#! /usr/bin/env sh
# output_env.sh
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
#
# Purpose
#
# To print out all the relevant information about the development environment.
#
# This includes:
# - architecture of the system
# - type and version of the operating system
# - version of make and cmake
# - version of armcc, clang, gcc-arm and gcc compilers
# - version of libc, clang, asan and valgrind if installed
# - version of gnuTLS and OpenSSL
print_version()
{
BIN="$1"
shift
ARGS="$1"
shift
VARIANT="$1"
shift
if [ -n "$VARIANT" ]; then
VARIANT=" ($VARIANT)"
fi
if ! type "$BIN" > /dev/null 2>&1; then
echo " * ${BIN##*/}$VARIANT: Not found."
return 0
fi
BIN=`which "$BIN"`
VERSION_STR=`$BIN $ARGS 2>&1`
# Apply all filters
while [ $# -gt 0 ]; do
FILTER="$1"
shift
VERSION_STR=`echo "$VERSION_STR" | $FILTER`
done
if [ -z "$VERSION_STR" ]; then
VERSION_STR="Version could not be determined."
fi
echo " * ${BIN##*/}$VARIANT: ${BIN} : ${VERSION_STR} "
}
echo "** Platform:"
echo
if [ `uname -s` = "Linux" ]; then
echo "Linux variant"
lsb_release -d -c
else
echo "Unknown Unix variant"
fi
echo
print_version "uname" "-a" ""
echo
echo
echo "** Tool Versions:"
echo
print_version "make" "--version" "" "head -n 1"
echo
print_version "cmake" "--version" "" "head -n 1"
echo
if [ "${RUN_ARMCC:-1}" -ne 0 ]; then
: "${ARMC5_CC:=armcc}"
print_version "$ARMC5_CC" "--vsn" "" "head -n 2"
echo
: "${ARMC6_CC:=armclang}"
print_version "$ARMC6_CC" "--vsn" "" "head -n 2"
echo
fi
print_version "arm-none-eabi-gcc" "--version" "" "head -n 1"
echo
print_version "gcc" "--version" "" "head -n 1"
echo
if [ -n "${GCC_EARLIEST+set}" ]; then
print_version "${GCC_EARLIEST}" "--version" "" "head -n 1"
else
echo " GCC_EARLIEST : Not configured."
fi
echo
if [ -n "${GCC_LATEST+set}" ]; then
print_version "${GCC_LATEST}" "--version" "" "head -n 1"
else
echo " GCC_LATEST : Not configured."
fi
echo
print_version "clang" "--version" "" "head -n 2"
echo
if [ -n "${CLANG_EARLIEST+set}" ]; then
print_version "${CLANG_EARLIEST}" "--version" "" "head -n 2"
else
echo " CLANG_EARLIEST : Not configured."
fi
echo
if [ -n "${CLANG_LATEST+set}" ]; then
print_version "${CLANG_LATEST}" "--version" "" "head -n 2"
else
echo " CLANG_LATEST : Not configured."
fi
echo
print_version "ldd" "--version" "" "head -n 1"
echo
print_version "valgrind" "--version" ""
echo
print_version "gdb" "--version" "" "head -n 1"
echo
print_version "perl" "--version" "" "head -n 2" "grep ."
echo
print_version "python" "--version" "" "head -n 1"
echo
print_version "python3" "--version" "" "head -n 1"
echo
# Find the installed version of Pylint. Installed as a distro package this
# can be pylint3, and as a PEP egg, pylint. In test scripts we prefer pylint
# over pylint3.
if type pylint >/dev/null 2>/dev/null; then
print_version "pylint" "--version" "" "sed /^.*config/d" "grep pylint"
elif type pylint3 >/dev/null 2>/dev/null; then
print_version "pylint3" "--version" "" "sed /^.*config/d" "grep pylint"
else
echo " * pylint or pylint3: Not found."
fi
echo
: ${OPENSSL:=openssl}
print_version "$OPENSSL" "version" "default"
echo
if [ -n "${OPENSSL_NEXT+set}" ]; then
print_version "$OPENSSL_NEXT" "version" "next"
else
echo " * openssl (next): Not configured."
fi
echo
: ${GNUTLS_CLI:=gnutls-cli}
print_version "$GNUTLS_CLI" "--version" "default" "head -n 1"
echo
: ${GNUTLS_SERV:=gnutls-serv}
print_version "$GNUTLS_SERV" "--version" "default" "head -n 1"
echo
echo " * Installed asan versions:"
if type dpkg-query >/dev/null 2>/dev/null; then
if ! dpkg-query -f '${Status} ${Package}: ${Version}\n' -W 'libasan*' |
awk '$3 == "installed" && $4 !~ /-/ {print $4, $5}' |
grep .
then
echo " No asan versions installed."
fi
else
echo " Unable to determine the asan version without dpkg."
fi
echo

externals/mbedtls/scripts/prepare_release.sh vendored Executable file
#!/bin/bash
print_usage()
{
cat <<EOF
Usage: $0 [OPTION]...
Prepare the source tree for a release.
Options:
-u Prepare for development (undo the release preparation)
EOF
}
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
set -eu
if [ $# -ne 0 ] && [ "$1" = "--help" ]; then
print_usage
exit
fi
unrelease= # if non-empty, we're in undo-release mode
while getopts u OPTLET; do
case $OPTLET in
u) unrelease=1;;
\?)
echo 1>&2 "$0: unknown option: -$OPTLET"
echo 1>&2 "Try '$0 --help' for more information."
exit 3;;
esac
done
#### .gitignore processing ####
GITIGNORES=$(find . -name ".gitignore")
for GITIGNORE in $GITIGNORES; do
if [ -n "$unrelease" ]; then
sed -i '/###START_COMMENTED_GENERATED_FILES###/,/###END_COMMENTED_GENERATED_FILES###/s/^#//' $GITIGNORE
sed -i 's/###START_COMMENTED_GENERATED_FILES###/###START_GENERATED_FILES###/' $GITIGNORE
sed -i 's/###END_COMMENTED_GENERATED_FILES###/###END_GENERATED_FILES###/' $GITIGNORE
else
sed -i '/###START_GENERATED_FILES###/,/###END_GENERATED_FILES###/s/^/#/' $GITIGNORE
sed -i 's/###START_GENERATED_FILES###/###START_COMMENTED_GENERATED_FILES###/' $GITIGNORE
sed -i 's/###END_GENERATED_FILES###/###END_COMMENTED_GENERATED_FILES###/' $GITIGNORE
fi
done
#### Build scripts ####
# GEN_FILES defaults on (non-empty) in development, off (empty) in releases
if [ -n "$unrelease" ]; then
r=' yes'
else
r=''
fi
sed -i 's/^\(GEN_FILES[ ?:]*=\)\([^#]*\)/\1'"$r/" Makefile */Makefile
# GEN_FILES defaults on in development, off in releases
if [ -n "$unrelease" ]; then
r='ON'
else
r='OFF'
fi
sed -i '/[Oo][Ff][Ff] in development/! s/^\( *option *( *GEN_FILES *"[^"]*" *\)\([A-Za-z0-9][A-Za-z0-9]*\)/\1'"$r/" CMakeLists.txt

#!/bin/bash
# Temporarily (de)ignore Makefiles generated by CMake to allow easier
# git development
#
# Copyright The Mbed TLS Contributors
# SPDX-License-Identifier: Apache-2.0 OR GPL-2.0-or-later
IGNORE=""
# Parse arguments
#
until [ -z "$1" ]
do
case "$1" in
-u|--undo)
IGNORE="0"
;;
-v|--verbose)
# Be verbose
VERBOSE="1"
;;
-h|--help)
# print help
echo "Usage: $0"
echo -e " -h|--help\t\tPrint this help."
echo -e " -u|--undo\t\tRemove ignores and continue tracking."
echo -e " -v|--verbose\t\tVerbose."
exit 1
;;
*)
# print error
echo "Unknown argument: '$1'"
exit 1
;;
esac
shift
done
if [ "X" = "X$IGNORE" ];
then
[ $VERBOSE ] && echo "Ignoring Makefiles"
git update-index --assume-unchanged Makefile library/Makefile programs/Makefile tests/Makefile
else
[ $VERBOSE ] && echo "Tracking Makefiles"
git update-index --no-assume-unchanged Makefile library/Makefile programs/Makefile tests/Makefile
fi

@rem Build and test Mbed TLS with Visual Studio using msbuild.
@rem Usage: windows_msbuild [RETARGET]
@rem RETARGET: version of Visual Studio to emulate
@rem https://docs.microsoft.com/en-us/cpp/build/how-to-modify-the-target-framework-and-platform-toolset
@rem These parameters are hard-coded for now.
set "arch=x64" & @rem "x86" or "x64"
set "cfg=Release" & @rem "Debug" or "Release"
set "vcvarsall=C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Auxiliary\Build\vcvarsall.bat"
if not "%~1"=="" set "retarget=,PlatformToolset=%1"
@rem If the %USERPROFILE%\Source directory exists, then running
@rem vcvarsall.bat will silently change the directory to that directory.
@rem Setting the VSCMD_START_DIR environment variable causes it to change
@rem to that directory instead.
set "VSCMD_START_DIR=%~dp0\..\visualc\VS2013"
"%vcvarsall%" %arch% && ^
msbuild /t:Rebuild /p:Configuration=%cfg%%retarget% /m mbedTLS.sln