author     Steven Knight <knight@baldmt.com>   2008-02-04 19:07:24 (GMT)
committer  Steven Knight <knight@baldmt.com>   2008-02-04 19:07:24 (GMT)
commit     be25024e65a30e65a9e3799ffa5323e23f49003d (patch)
tree       439714b7733dd4ffe858b1fc15e15fa3e036048d /src
parent     0abfc01296f0184fb6c997400b92cfa7e48d81ef (diff)
Merged revisions 2527-2645 via svnmerge from
http://scons.tigris.org/svn/scons/branches/core

........
r2528 | stevenknight | 2007-12-13 06:08:21 -0800 (Thu, 13 Dec 2007) | 5 lines
Remove the .del_binfo() method, no longer needed since the Big Signature Refactoring causes us to visit every Node in order during the DAG walk, and the BuildInfo object now just holds information for storage in the .sconsign file.
........
r2529 | stevenknight | 2007-12-13 13:17:15 -0800 (Thu, 13 Dec 2007) | 3 lines
Fix the --keep-going flag so it builds all possible targets even when a later top-level target depends on a child that failed its build.
........
r2530 | stevenknight | 2007-12-14 04:02:05 -0800 (Fri, 14 Dec 2007) | 4 lines
Issue 1715: BuildDir(duplicate=0) support for Tex/LaTeX. Re-run LaTeX in response to package warnings. (Rob Managan)
........
r2531 | stevenknight | 2007-12-14 07:14:31 -0800 (Fri, 14 Dec 2007) | 3 lines
Refactor the max_drift logic around fetching stored signatures into its own new method.
........
r2532 | stevenknight | 2007-12-14 07:18:44 -0800 (Fri, 14 Dec 2007) | 3 lines
Have get_csig() return the stored content signature if max_drift says it's okay.
........
r2533 | stevenknight | 2007-12-14 18:34:51 -0800 (Fri, 14 Dec 2007) | 2 lines
Issue 1859: Support SWIG statements like %module(directors="1").
........
r2534 | stevenknight | 2007-12-15 03:51:13 -0800 (Sat, 15 Dec 2007) | 3 lines
Python 2.1 portability fix w.r.t. "import SCons" and "import SCons.platform.win32" and binding local variables and whatnot.
........
r2535 | stevenknight | 2007-12-15 03:51:56 -0800 (Sat, 15 Dec 2007) | 2 lines
Python 1.5 fix: use the -classic flag when invoking SWIG.
........
r2536 | stevenknight | 2007-12-15 06:03:48 -0800 (Sat, 15 Dec 2007) | 4 lines
Support subclasses of the new-style str() class as input to Builders and the like. Also speed up all of the Util.is_*() functions when using new-style classes by just using isinstance() internally.
........
r2537 | stevenknight | 2007-12-15 06:35:49 -0800 (Sat, 15 Dec 2007) | 3 lines
Issue 1851: Fix being able to use $PDB and $WINDOWS_INSERT_MANIFEST together. (Benoit Belley)
........
r2538 | stevenknight | 2007-12-15 06:59:43 -0800 (Sat, 15 Dec 2007) | 3 lines
Handle dangling entries for the Intel C compiler in the Windows registry. (Benoit Belley)
........
r2539 | stevenknight | 2007-12-15 09:51:59 -0800 (Sat, 15 Dec 2007) | 2 lines
Reorganize library-related tests into a separate subdirectory.
........
r2540 | stevenknight | 2007-12-15 09:57:29 -0800 (Sat, 15 Dec 2007) | 4 lines
Issue 1850: better support for non-standard shared library prefixes and suffixes by stripping all prefixes and suffixes in lists of $SHLIBPREFIXES and $SHLIBSUFFIXES. (Benoit Belley)
........
r2541 | stevenknight | 2007-12-15 18:49:15 -0800 (Sat, 15 Dec 2007) | 2 lines
Python 1.5 portability fixes.
........
r2542 | stevenknight | 2007-12-15 19:02:39 -0800 (Sat, 15 Dec 2007) | 3 lines
Issue 1768: Have the D language scanner search for .di files as well as .d files. (Jerome Berger)
........
r2543 | stevenknight | 2007-12-16 14:31:40 -0800 (Sun, 16 Dec 2007) | 3 lines
Add a find_include_names() method to the Scanner.Classic class to abstract out how included names can be generated by subclasses. (Jerome Berger)
........
r2544 | stevenknight | 2007-12-16 14:31:54 -0800 (Sun, 16 Dec 2007) | 3 lines
Add a find_include_names() method to the Scanner.Classic class to abstract out how included names can be generated by subclasses. (Jerome Berger)
........
r2545 | stevenknight | 2007-12-16 15:04:43 -0800 (Sun, 16 Dec 2007) | 3 lines
Issue 1860: Support the D scanner returning multiple modules from a single import statement. (Jerome Berger)
........
r2546 | stevenknight | 2007-12-16 17:41:17 -0800 (Sun, 16 Dec 2007) | 3 lines
Issue 1861: Fix the ability to #include a file (or search other $*PATH variables) that has an absolute path.
........
r2547 | stevenknight | 2007-12-18 08:09:59 -0800 (Tue, 18 Dec 2007) | 2 lines
Replace uses of "is_List() or is_Tuple()" with is_Sequence().
........
r2548 | stevenknight | 2007-12-18 08:13:14 -0800 (Tue, 18 Dec 2007) | 2 lines
Report the incorrect value in assertions.
........
r2549 | stevenknight | 2007-12-19 07:58:56 -0800 (Wed, 19 Dec 2007) | 3 lines
Fix handling #includes of absolute path names when the path doesn't exist (implicitly, because it's #ifdef'ed out).
........
r2550 | stevenknight | 2007-12-19 08:29:24 -0800 (Wed, 19 Dec 2007) | 4 lines
Fix test path examination when the temporary directory location is redirected via symlinks (e.g. /usr/tmp -> /var/tmp on Red Hat). (Benoit Belley)
........
r2551 | stevenknight | 2007-12-19 08:30:17 -0800 (Wed, 19 Dec 2007) | 2 lines
Fix scons-time path reporting when symlinks are involved. (Benoit Belley)
........
r2552 | stevenknight | 2007-12-19 22:51:18 -0800 (Wed, 19 Dec 2007) | 4 lines
Issue 1855: Reduce the worker thread stack size to a default of 256 Kbytes. Add a --stack-size= command-line option, also configurable via SetOption('stack_size'). (Benoit Belley)
........
r2553 | stevenknight | 2007-12-20 18:25:50 -0800 (Thu, 20 Dec 2007) | 2 lines
Skip this test if SWIG isn't installed.
........
r2554 | stevenknight | 2007-12-20 18:26:21 -0800 (Thu, 20 Dec 2007) | 2 lines
Accommodate slightly different permissions errors on Ubuntu Gutsy.
........
r2555 | stevenknight | 2007-12-21 02:12:09 -0800 (Fri, 21 Dec 2007) | 3 lines
Fix a Python 2.2 quirk in the reported file name ("<string>") when encountering a SyntaxError in a SConstruct file.
........
r2556 | stevenknight | 2007-12-21 02:12:35 -0800 (Fri, 21 Dec 2007) | 2 lines
Enforce order between the build of f1.out and f2.out.
........
r2557 | stevenknight | 2007-12-21 02:12:55 -0800 (Fri, 21 Dec 2007) | 2 lines
Don't die if the Python version doesn't have os.path.realpath().
........
r2558 | stevenknight | 2007-12-21 02:13:19 -0800 (Fri, 21 Dec 2007) | 2 lines
Refactor the test/build-errors.py script into separate scripts for each test.
........
r2559 | stevenknight | 2007-12-21 08:08:12 -0800 (Fri, 21 Dec 2007) | 3 lines
Issue 1864: Add a CheckDeclaration() call to configure contexts. (David Cournapeau)
........
r2560 | stevenknight | 2007-12-21 08:18:47 -0800 (Fri, 21 Dec 2007) | 2 lines
Issue 1865: Improve the CheckTypeSize() code. (David Cournapeau)
........
r2561 | stevenknight | 2007-12-21 08:21:47 -0800 (Fri, 21 Dec 2007) | 2 lines
Fix os.path.realpath() handling (a Python 2.1 portability issue).
........
r2562 | stevenknight | 2007-12-21 14:08:39 -0800 (Fri, 21 Dec 2007) | 2 lines
Split CPPDEFINES.py into separate sub-test scripts.
........
r2563 | stevenknight | 2007-12-21 15:56:26 -0800 (Fri, 21 Dec 2007) | 6 lines
Support proper expansion of construction variables containing lists within expansions like $CPPPATH. Change env.subst() to return a list, not a joined string, when the input is a list. (Konstantin Bozhikov)
........
r2564 | stevenknight | 2007-12-22 04:15:11 -0800 (Sat, 22 Dec 2007) | 2 lines
Normalize the ModDate field when comparing generated PDF files.
........
r2565 | stevenknight | 2007-12-22 22:01:45 -0800 (Sat, 22 Dec 2007) | 5 lines
Java test refactoring to commonize construction environment initialization and searching for javac / javah / jar / rmic. Don't look for *_Skel.class files to be created by Java 1.[56]. Minor Java code changes to deal with compiler warnings.
........
r2566 | stevenknight | 2007-12-23 05:20:45 -0800 (Sun, 23 Dec 2007) | 2 lines
Don't still look for *_Skel.class files.
........
r2567 | stevenknight | 2007-12-23 07:30:36 -0800 (Sun, 23 Dec 2007) | 5 lines
Fix Intel C compiler issues: Issue 1863: Fix failure to match /opt/intel_cc_* directories. (Benoit Belley) Issue 1866: Fix topdir when the version isn't specified. (Jonas Olsson) Issue 1867: Fix use of network licenses. (Jonas Olsson)
........
r2573 | stevenknight | 2008-01-01 09:59:16 -0800 (Tue, 01 Jan 2008) | 3 lines
Add asynchronous subprocess communication via new start() and finish() methods.
........
r2574 | stevenknight | 2008-01-01 10:02:26 -0800 (Tue, 01 Jan 2008) | 4 lines
Minor code cleanup: attach the version string to the options parser object, instead of passing it in to deal with the lack of nested scopes in Python 1.5.2.
........
r2575 | stevenknight | 2008-01-01 10:08:46 -0800 (Tue, 01 Jan 2008) | 4 lines
Rename the CacheDir class and let the name CacheDir be a variable that can be reset at will, depending on whether CacheDir() support is enabled or disabled at any particular time.
........
r2576 | stevenknight | 2008-01-01 10:14:58 -0800 (Tue, 01 Jan 2008) | 2 lines
Restore the Node.del_binfo() method and its call in Node.clear().
........
r2577 | stevenknight | 2008-01-02 07:51:25 -0800 (Wed, 02 Jan 2008) | 6 lines
Refactor CacheDir support (again) for --interactive mode. Delay effects of --cache-* settings until they're needed by getting rid of the Null() object pattern and the functional programming idiom of replacing the CacheDebug method. Have the Environment.CacheDir() method just record the path for later instantiation.
........
r2578 | stevenknight | 2008-01-02 18:48:12 -0800 (Wed, 02 Jan 2008) | 3 lines
Issue 1657: Add a --interactive option to create a command-line interpreter for re-building targets without re-reading SConscript files.
........
r2579 | stevenknight | 2008-01-02 21:54:38 -0800 (Wed, 02 Jan 2008) | 2 lines
Python 1.5.2 portability fix (no use of +=).
........
r2580 | stevenknight | 2008-01-02 21:54:47 -0800 (Wed, 02 Jan 2008) | 3 lines
Use a regular expression to avoid having to match a specific MD5 checksum value in the --cache-debug output.
........
r2581 | stevenknight | 2008-01-02 21:54:59 -0800 (Wed, 02 Jan 2008) | 4 lines
Don't bother looking for shlex.split(), since our compatibility layer provides it in older Python versions. Make the compatibility version of shlex.split() not treat '.' as a token separator.
........
r2582 | stevenknight | 2008-01-02 21:56:15 -0800 (Wed, 02 Jan 2008) | 3 lines
Python 1.5.2 portability fixes: no list comprehensions, no nested scopes, no "for x in" a dictionary.
........
r2583 | stevenknight | 2008-01-03 07:39:59 -0800 (Thu, 03 Jan 2008) | 3 lines
Fix a left-over use of a string method. Fix printing --interactive help text, which I outright broke last checkin.
........
r2584 | stevenknight | 2008-01-03 07:58:56 -0800 (Thu, 03 Jan 2008) | 4 lines
Import the vanilla Python2.5 shlex module, which we'll use as a basis for retrofitting to old Python versions to provide shlex.split() functionality.
........
r2585 | stevenknight | 2008-01-03 08:01:02 -0800 (Thu, 03 Jan 2008) | 3 lines
Modifications to the vanilla Python 2.5 shlex module to make it work back to Python 1.5.
........
r2586 | stevenknight | 2008-01-03 08:04:31 -0800 (Thu, 03 Jan 2008) | 3 lines
Use the new shlex compatibility module if we're using an old version of Python with a native shlex module that has no shlex.split() function.
........
r2587 | stevenknight | 2008-01-03 09:31:15 -0800 (Thu, 03 Jan 2008) | 3 lines
Fix the ParseFlags() unit test now that we have a real shlex.split() function even on earlier Python versions.
........
r2588 | stevenknight | 2008-01-06 04:52:05 -0800 (Sun, 06 Jan 2008) | 3 lines
Add compat/_scons_shlex.py to exception lists for __copyright__ and __revision__ strings.
........
r2589 | stevenknight | 2008-01-06 06:32:07 -0800 (Sun, 06 Jan 2008) | 2 lines
Remove leftover debug print.
........
r2590 | stevenknight | 2008-01-06 07:35:46 -0800 (Sun, 06 Jan 2008) | 3 lines
Change the test to work by wrapping the public .__call__() method of the C scanner, instead of the internal .scan() method.
........
r2591 | stevenknight | 2008-01-06 07:39:12 -0800 (Sun, 06 Jan 2008) | 3 lines
Use the public CScan.path() method, not the internal CScan.path_function attribute.
........
r2592 | stevenknight | 2008-01-07 02:55:53 -0800 (Mon, 07 Jan 2008) | 2 lines
Use a tuple instead of a list for the cpp module path(s).
........
r2593 | stevenknight | 2008-01-07 03:10:28 -0800 (Mon, 07 Jan 2008) | 2 lines
Don't die if a macro function expands to a non-string (an integer).
........
r2594 | stevenknight | 2008-01-07 03:29:12 -0800 (Mon, 07 Jan 2008) | 3 lines
Python 1.5 throws TypeError, not AttributeError if you try to string.split() a non-string value.
........
r2595 | stevenknight | 2008-01-07 03:30:18 -0800 (Mon, 07 Jan 2008) | 3 lines
Reduce duplicate execution of individual test_*() unit test methods by eliminating duplicates (if the set() type is available).
........
r2596 | stevenknight | 2008-01-07 06:57:30 -0800 (Mon, 07 Jan 2008) | 6 lines
Add a basic test of in-line #include handling. Sort the test names. Don't os.path.join() the directory name if we find the file in the current directory. Use os.curdir instead of hard-coding '.' as the current directory.
........
r2597 | stevenknight | 2008-01-07 06:59:29 -0800 (Mon, 07 Jan 2008) | 3 lines
Read files with a new .read_file() method, so it can be overridden by subclasses.
........
r2598 | stevenknight | 2008-01-07 17:59:50 -0800 (Mon, 07 Jan 2008) | 6 lines
Record the name of the file currently being processed. Make the public API (the .__call__() method) passing in a file name to be opened, and have it call a new, separate .process_contents() method (the old .__call__() method) for handling in-memory strings.
........
r2599 | stevenknight | 2008-01-07 20:03:18 -0800 (Mon, 07 Jan 2008) | 3 lines
Make the test failure informative when we don't find the includes we expect by printing the expected string and actual output.
........
r2600 | stevenknight | 2008-01-07 20:24:21 -0800 (Mon, 07 Jan 2008) | 2 lines
Handle no white space after #include (e.g. #include<foo.h>).
........
r2601 | stevenknight | 2008-01-07 21:01:27 -0800 (Mon, 07 Jan 2008) | 4 lines
Fixes for older Python versions: No tempfile.mktemp(prefix=) argument. No string methods.
........
r2602 | stevenknight | 2008-01-08 20:57:30 -0800 (Tue, 08 Jan 2008) | 3 lines
Fix command-line editing of --interactive mode with the readline module by only changing sys.stdout to our Unbuffered class if it isn't a tty.
........
r2603 | stevenknight | 2008-01-08 22:12:20 -0800 (Tue, 08 Jan 2008) | 4 lines
Fix the --interactive "build" command with no targets: build the specified Default() targets; issue an error message but don't exit if Default(None) is explicitly specified.
........
r2604 | stevenknight | 2008-01-09 05:00:36 -0800 (Wed, 09 Jan 2008) | 9 lines
Improve Python functions used as actions by incorporating into their build signatures:
- literal values referenced by the byte code.
- values of default arguments
- code of nested functions
- values of variables captured by closures
- names of referenced global variables and functions
(Benoit Belley)
........
r2605 | stevenknight | 2008-01-09 06:39:03 -0800 (Wed, 09 Jan 2008) | 4 lines
Add a Configure.Define() method for adding arbitrary #define lines to generated configure header files. (David Cournapeau)
........
r2606 | stevenknight | 2008-01-09 07:33:21 -0800 (Wed, 09 Jan 2008) | 4 lines
Issue 1858: Fix the closing message when --clean and --keep-going are both used so it only reports errors if some actually occurred. (Benoit Belley)
........
r2607 | stevenknight | 2008-01-09 07:51:55 -0800 (Wed, 09 Jan 2008) | 3 lines
Issue 1843: Add a gfortran Tool module for the GNU F95/F2003 compiler. (David Cournapeau)
........
r2608 | stevenknight | 2008-01-09 09:31:15 -0800 (Wed, 09 Jan 2008) | 4 lines
Issue 1733: If $JARCHDIR isn't set explicitly, use the .java_classdir attribute that was set when the Java() Builder built the .class files. (Jan Nijtmans)
........
r2609 | stevenknight | 2008-01-09 11:27:28 -0800 (Wed, 09 Jan 2008) | 4 lines
Allow Scanner.FindPathDirs objects to not take a dir= keyword argument when called. (The code already detects that and uses the current directory if necessary.)
........
r2610 | stevenknight | 2008-01-09 12:23:26 -0800 (Wed, 09 Jan 2008) | 3 lines
Allow subclass overrides of results-handling by the addition of new initialize_result() and finalize_result() methods.
........
r2611 | stevenknight | 2008-01-09 14:49:50 -0800 (Wed, 09 Jan 2008) | 6 lines
Capture new C Scanner glue code that knows how to use $CPPDEFINES to evaluate CPP #if/#ifdef/#elif/#else lines. Currently disabled (including the test script that validates the behavior) while we look for the right way to let users configure the feature, and work on performance issues with its O(N*M) algorithm.
........
r2612 | stevenknight | 2008-01-24 20:42:57 -0800 (Thu, 24 Jan 2008) | 3 lines
Fix regular expression comparisons on Windows by escaping the \ path separators.
........
r2613 | stevenknight | 2008-01-24 20:49:04 -0800 (Thu, 24 Jan 2008) | 3 lines
Rename a created stub script from "cmd.py" so it doesn't mistakenly get imported by the "import cmd" statement in Script/Interactive.py.
........
r2614 | stevenknight | 2008-01-24 20:56:05 -0800 (Thu, 24 Jan 2008) | 4 lines
Fix a race condition between the actions executed by the worker threads by having the dependent action print its own execution line, and telling SCons to treat it silently (strfunction=None).
........
r2615 | stevenknight | 2008-01-24 20:59:03 -0800 (Thu, 24 Jan 2008) | 2 lines
Remove left-over commented-out lines.
........
r2616 | stevenknight | 2008-01-24 21:59:49 -0800 (Thu, 24 Jan 2008) | 7 lines
Windows portability in --interactive mode and its tests: Quote target names that may have spaces in them. Use the .exe suffix on a generated executable. Use the subprocess .wait() method to get the subprocess exit status when shelling out on Windows. Use an Unbuffered object for stderr (when it's not a tty).
........
r2617 | stevenknight | 2008-01-24 22:14:49 -0800 (Thu, 24 Jan 2008) | 3 lines
Issue 1886: Fix the ability to build Aliases in --interactive mode. (Gary Oberbrunner)
........
r2618 | stevenknight | 2008-01-24 22:33:29 -0800 (Thu, 24 Jan 2008) | 3 lines
Issue 1886: Handle Python versions that throw TypeError when they can't pickle a nested function. (Gary Oberbrunner)
........
r2619 | stevenknight | 2008-01-24 22:38:44 -0800 (Thu, 24 Jan 2008) | 3 lines
Fix the LoadableModule.py test when run on Intel Macs (look for the string i386 in the file output, in addition to ppc).
........
r2620 | stevenknight | 2008-01-25 06:50:43 -0800 (Fri, 25 Jan 2008) | 4 lines
Issue 1892: use "link" instead of "gnulink" for the Mac tool chain, since it doesn't understand the -rpath option and can't use $RPATH. (David Cournapeau)
........
r2621 | stevenknight | 2008-01-25 07:51:56 -0800 (Fri, 25 Jan 2008) | 2 lines
Issue 1893: add Intel C compiler support on Mac OS X. (Benoit Belley)
........
r2622 | stevenknight | 2008-01-25 21:48:16 -0800 (Fri, 25 Jan 2008) | 2 lines
Fix how we handle falling back to timestamps when no md5.py module exists.
........
r2623 | stevenknight | 2008-01-26 16:55:56 -0800 (Sat, 26 Jan 2008) | 5 lines
Work around a metaclass / new.instancemethod() bug in base Python 2.2 by disallowing --debug=memoizer functionality if Python can't handle the Memoizer initialization (much like we do for earlier Python versions that don't have metaclasses at all).
........
r2624 | stevenknight | 2008-01-26 18:22:14 -0800 (Sat, 26 Jan 2008) | 4 lines
Fix CacheDir by simplifying how the NullEnvironment hands back something that looks enough like a CacheDir object that the rest of the code doesn't require special handling.
........
r2625 | stevenknight | 2008-01-26 20:56:17 -0800 (Sat, 26 Jan 2008) | 2 lines
Have the "scons-time time" subcommand handle empty files gracefully.
........
r2626 | stevenknight | 2008-01-26 20:57:21 -0800 (Sat, 26 Jan 2008) | 3 lines
Add a Trace() statement to the Node.changed() method if the dependency lists are different lengths.
........
r2627 | stevenknight | 2008-01-26 21:30:59 -0800 (Sat, 26 Jan 2008) | 3 lines
Have the "scons-time time --which" subcommand handle files that don't contain the requested results.
........
r2628 | stevenknight | 2008-01-26 21:52:51 -0800 (Sat, 26 Jan 2008) | 2 lines
Fix the ability to draw vertical bars with --fmt gnuplot option.
........
r2629 | stevenknight | 2008-01-26 22:23:10 -0800 (Sat, 26 Jan 2008) | 3 lines
Allow "scons-time run" to copy non-archive files for timing. Document the archive_list config file variable.
........
r2630 | stevenknight | 2008-01-27 10:38:11 -0800 (Sun, 27 Jan 2008) | 3 lines
Use the maximum Y value, not the maximum X value, as the top Y endpoint of a vertical bar drawn with --fmt=gnuplot.
........
r2631 | stevenknight | 2008-01-27 12:05:40 -0800 (Sun, 27 Jan 2008) | 2 lines
Make scons-time more robust when handling log files that have no results.
........
r2632 | stevenknight | 2008-01-27 12:49:02 -0800 (Sun, 27 Jan 2008) | 2 lines
Rotate label positions so they don't overwrite each other.
........
r2633 | stevenknight | 2008-01-27 16:21:17 -0800 (Sun, 27 Jan 2008) | 2 lines
Extend vertical bars to graph top, not maximum X value.
........
r2634 | stevenknight | 2008-01-27 18:08:19 -0800 (Sun, 27 Jan 2008) | 2 lines
Capture three configurations for timing various aspects of SCons.
........
r2635 | stevenknight | 2008-01-28 04:55:12 -0800 (Mon, 28 Jan 2008) | 2 lines
Fix jar calls to use "tf" instead of "-t -f" for compatibility with Sun.
........
r2636 | stevenknight | 2008-01-28 12:49:58 -0800 (Mon, 28 Jan 2008) | 6 lines
Refactor cut-and-paste tempdir_re() function into a common method in QMTest/TestSCons_time.py. In the refactored code, fix typo of os.path.relpath() where we meant os.path.realpath(), so we follow the /tmp -> /private/tmp symlink on Mac OS X.
........
r2637 | stevenknight | 2008-01-28 15:18:14 -0800 (Mon, 28 Jan 2008) | 5 lines
Apple portability in the test for explicit "No such file" error messages from trying to fork()/exec() a non-existent file name. Refactor the tests for (non-)expected output in stderr so they're informative if they fail.
........
r2638 | stevenknight | 2008-01-28 17:54:29 -0800 (Mon, 28 Jan 2008) | 3 lines
Make the test output deterministic by making the InstallAs() targets (file[23].out) depend on the Install() target (file1.out).
........
r2639 | stevenknight | 2008-01-28 21:37:38 -0800 (Mon, 28 Jan 2008) | 4 lines
On Mac OS X, add -w to LINKFLAGS to suppress warnings about the directories we specify as -L arguments which don't actually exist. We just want to make sure that the right directory names show up.
........
r2640 | stevenknight | 2008-01-28 21:38:36 -0800 (Mon, 28 Jan 2008) | 3 lines
On Mac OS X, the generated include file for C++ just tacks ".h" on the end of the generated .cpp file name. Define $YACCHXXFILESUFFIX accordingly.
........
r2641 | stevenknight | 2008-01-29 04:56:18 -0800 (Tue, 29 Jan 2008) | 3 lines
Add the src/CHANGES.txt for the previous change (Mac OS X bison behavior). Add a "bison" application entity to the DocBook infrastructure.
........
r2642 | stevenknight | 2008-01-30 05:09:02 -0800 (Wed, 30 Jan 2008) | 5 lines
Improve QT tests for Mac OS X: More general regular expression match for a "Generated moc file" warning. Copy libmyqt.dylib to the same directory as the "aaa" executable so it's found when we run it.
........
r2643 | stevenknight | 2008-01-30 05:19:23 -0800 (Wed, 30 Jan 2008) | 2 lines
Skip the test of Java handling SWIG dependencies if swig isn't installed.
........
r2644 | stevenknight | 2008-01-30 06:44:30 -0800 (Wed, 30 Jan 2008) | 2 lines
Remove left-over print statement.
........
r2645 | stevenknight | 2008-01-30 06:52:54 -0800 (Wed, 30 Jan 2008) | 2 lines
Mac OS X fix: use .dylib, not .so, in the list of "weird suffixes" we test.
........
Diffstat (limited to 'src')
-rw-r--r--  src/CHANGES.txt | 135
-rw-r--r--  src/RELEASE.txt | 37
-rw-r--r--  src/engine/MANIFEST.in | 3
-rw-r--r--  src/engine/SCons/Action.py | 185
-rw-r--r--  src/engine/SCons/ActionTests.py | 21
-rw-r--r--  src/engine/SCons/CacheDir.py | 44
-rw-r--r--  src/engine/SCons/Conftest.py | 78
-rw-r--r--  src/engine/SCons/Defaults.py | 36
-rw-r--r--  src/engine/SCons/Environment.py | 24
-rw-r--r--  src/engine/SCons/EnvironmentTests.py | 54
-rw-r--r--  src/engine/SCons/Executor.py | 10
-rw-r--r--  src/engine/SCons/Job.py | 50
-rw-r--r--  src/engine/SCons/JobTests.py | 2
-rw-r--r--  src/engine/SCons/Memoize.py | 42
-rw-r--r--  src/engine/SCons/MemoizeTests.py | 4
-rw-r--r--  src/engine/SCons/Node/FS.py | 80
-rw-r--r--  src/engine/SCons/Node/NodeTests.py | 14
-rw-r--r--  src/engine/SCons/Node/__init__.py | 16
-rw-r--r--  src/engine/SCons/PathList.py | 9
-rw-r--r--  src/engine/SCons/Platform/posix.py | 2
-rw-r--r--  src/engine/SCons/SConf.py | 41
-rw-r--r--  src/engine/SCons/SConfTests.py | 55
-rw-r--r--  src/engine/SCons/Scanner/C.py | 84
-rw-r--r--  src/engine/SCons/Scanner/D.py | 23
-rw-r--r--  src/engine/SCons/Scanner/LaTeX.py | 2
-rw-r--r--  src/engine/SCons/Scanner/ScannerTests.py | 3
-rw-r--r--  src/engine/SCons/Scanner/__init__.py | 7
-rw-r--r--  src/engine/SCons/Script/Interactive.py | 359
-rw-r--r--  src/engine/SCons/Script/Main.py | 104
-rw-r--r--  src/engine/SCons/Script/SConsOptions.py | 23
-rw-r--r--  src/engine/SCons/Subst.py | 66
-rw-r--r--  src/engine/SCons/SubstTests.py | 26
-rw-r--r--  src/engine/SCons/Taskmaster.py | 44
-rw-r--r--  src/engine/SCons/Tool/applelink.py | 6
-rw-r--r--  src/engine/SCons/Tool/gfortran.py | 62
-rw-r--r--  src/engine/SCons/Tool/gfortran.xml | 15
-rw-r--r--  src/engine/SCons/Tool/intelc.py | 77
-rw-r--r--  src/engine/SCons/Tool/jar.py | 25
-rw-r--r--  src/engine/SCons/Tool/jar.xml | 5
-rw-r--r--  src/engine/SCons/Tool/link.py | 9
-rw-r--r--  src/engine/SCons/Tool/mslink.py | 48
-rw-r--r--  src/engine/SCons/Tool/qt.py | 2
-rw-r--r--  src/engine/SCons/Tool/rmic.py | 6
-rw-r--r--  src/engine/SCons/Tool/swig.py | 3
-rw-r--r--  src/engine/SCons/Tool/tex.py | 98
-rw-r--r--  src/engine/SCons/Tool/yacc.py | 11
-rw-r--r--  src/engine/SCons/Tool/yacc.xml | 8
-rw-r--r--  src/engine/SCons/Util.py | 124
-rw-r--r--  src/engine/SCons/UtilTests.py | 28
-rw-r--r--  src/engine/SCons/Warnings.py | 3
-rw-r--r--  src/engine/SCons/compat/__init__.py | 22
-rw-r--r--  src/engine/SCons/compat/_scons_shlex.py | 319
-rw-r--r--  src/engine/SCons/cpp.py | 73
-rw-r--r--  src/engine/SCons/cppTests.py | 156
-rw-r--r--  src/script/scons-time.py | 70
-rw-r--r--  src/test_strings.py | 4
56 files changed, 2383 insertions, 474 deletions
diff --git a/src/CHANGES.txt b/src/CHANGES.txt
index c8dc7d3..81b54e6 100644
--- a/src/CHANGES.txt
+++ b/src/CHANGES.txt
@@ -8,6 +8,141 @@
+RELEASE 0.XX - XXX
+
+ From Benoit Belley:
+
+ - Fix the --keep-going flag so it builds all possible targets even when
+ a later top-level target depends on a child that failed its build.
+
+ - Fix being able to use $PDB and $WINDOWS_INSERT_MANIFEST together.
+
+ - Don't crash if un-installing the Intel C compiler leaves left-over,
+ dangling entries in the Windows registry.
+
+ - Improve support for non-standard library prefixes and suffixes by
+ stripping all prefixes/suffixes from the file name strings as appropriate.
+
+ - Reduce the default stack size for -j worker threads to 256 Kbytes.
+ Provide user control over this value by adding --stack-size and
+ --warn=stack-size options, and a SetOption('stack_size') function.
+
+ - Fix a crash on Linux systems when trying to use the Intel C compiler
+ and no /opt/intel_cc_* directories are found.
+
+ - Improve using Python functions as actions by incorporating into
+ a FunctionAction's signature:
+ - literal values referenced by the byte code.
+ - values of default arguments
+ - code of nested functions
+ - values of variables captured by closures
+ - names of referenced global variables and functions
+
+ - Fix the closing message when --clean and --keep-going are both
+ used and no errors occur.
+
+ - Add support for the Intel C compiler on Mac OS X.
+
+ From Jérôme Berger:
+
+ - Have the D language scanner search for .di files as well as .d files.
+
+ - Add a find_include_names() method to the Scanner.Classic class to
+ abstract out how included names can be generated by subclasses.
+
+ - Allow the D language scanner to detect multiple modules imported by
+ a single statement.
+
+ From Konstantin Bozhikov:
+
+ - Support expansion of construction variables that contain or refer
+ to lists of other variables or Nodes within expansions like $CPPPATH.
+
+ - Change variable substitution (the env.subst() method) so that an
+ input sequence (list or tuple) is preserved as a list in the output.
+
+ From David Cournapeau:
+
+ - Add a CheckDeclaration() call to configure contexts.
+
+ - Improve the CheckTypeSize() code.
+
+ - Add a Define() call to configure contexts, to add arbitrary #define
+ lines to a generated configure header file.
+
+ - Add a "gfortran" Tool module for the GNU F95/F2003 compiler.
+
+ - Avoid use of -rpath with the Mac OS X linker.
+
+ From Steven Knight:
+
+ - Support the ability to subclass the new-style "str" class as input
+ to Builders.
+
+ - Improve the performance of our type-checking by using isinstance()
+ with new-style classes.
+
+ - Fix #include (and other $*PATH variables searches) of files with
+ absolute path names. Don't die if they don't exist (due to being
+ #ifdef'ed out or the like).
+
+ - Fix --interactive mode when Default(None) is used.
+
+ - Fix --debug=memoizer to work around a bug in base Python 2.2 metaclass
+ initialization (by just not allowing Memoization in Python versions
+ that have the bug).
+
+ - Have the "scons-time time" subcommand handle empty log files, and
+ log files that contain no results specified by the --which option.
+
+ - Fix the max Y of vertical bars drawn by "scons-time --fmt=gnuplot".
+
+ - On Mac OS X, account for the fact that the header file generated
+ from a C++ file will be named (e.g.) file.cpp.h, not file.hpp.
+
+ From Rob Managan:
+
+ - Enhance TeX and LaTeX support to work with BuildDir(duplicate=0).
+
+ - Re-run LaTeX when it issues a package warning that it must be re-run.
+
+ From Jan Nijtmans:
+
+ - If $JARCHDIR isn't set explicitly, use the .java_classdir attribute
+ that was set when the Java() Builder built the .class files.
+
+ From Gary Oberbrunner:
+
+ - Fix the ability to build an Alias in --interactive mode.
+
+ - Fix the ability to hash the contents of actions for nested Python
+ functions on Python versions where the inability to pickle them
+ returns a TypeError (instead of the documented PicklingError).
+
+ From Jonas Olsson:
+
+ - Fix use of the Intel C compiler when the top compiler directory,
+ but not the compiler version, is specified.
+
+ - Handle Intel C compiler network license files (port@system).
+
+ From Adam Simpkins:
+
+ - Add a --interactive option that starts a session for building (or
+ cleaning) targets without re-reading the SConscript files every time.
+
+ - Fix use of readline command-line editing in --interactive mode.
+
+ - Have the --interactive mode "build" command with no arguments
+ build the specified Default() targets.
+
+ From Ben Webb:
+
+ - Support the SWIG %module statement with following modifiers in
+ parentheses (e.g., '%module(directors="1")').
+
+
+
RELEASE 0.97.0d20071212 - Wed, 12 Dec 2007 09:29:32 -0600
From Benoit Belley:
diff --git a/src/RELEASE.txt b/src/RELEASE.txt
index ca01607..7327f21 100644
--- a/src/RELEASE.txt
+++ b/src/RELEASE.txt
@@ -25,6 +25,43 @@ RELEASE 0.97.0d20071212 - Wed, 12 Dec 2007 09:29:32 -0600
This is the eighth beta release of SCons. Please consult the
CHANGES.txt file for a list of specific changes since last release.
+ Please note the following important changes since release 0.97.0d20071212:
+
+ -- THE env.subst() METHOD NOW RETURNS A LIST WHEN THE INPUT IS A SEQUENCE
+
+ The env.subst() method now returns a list with the elements
+ expanded when given a list as input. Previously, the env.subst()
+ method would always turn its result into a string.
+
+ This behavior was changed because of the way it interfered with
+ being able to include things like lists within the expansion
+ of variables like $CPPPATH and have SCons understand that the
+ elements of the "internal" lists still needed to be treated
+ separately. This would show up as an internal list like ['subdir1',
+ 'subdir2'] being flattened into "-Isubdir1 subdir2" on the command
+ line, instead of the intended "-Isubdir1 -Isubdir2".
+
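+ For example (an illustrative sketch; the variable names here are
+ made up for this note):
+
+     env = Environment(FOO = 'foo', BAR = 'bar')
+     env.subst('$FOO and $BAR')      # still a string: 'foo and bar'
+     env.subst(['$FOO', '$BAR'])     # now a list: ['foo', 'bar']
+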
+ -- THE Jar() BUILDER NOW USES THE Java() BUILDER CLASSDIR BY DEFAULT
+
+ By default, the Jar() Builder will now use the class directory
+ specified when the Java() builder is called. So the following
+ input:
+
+ classes = env.Java('classes', 'src')
+ env.Jar('out.jar', classes)
+
+ Will cause "-C classes" to be passed the "jar" command invocation,
+ and the Java classes in the "out.jar" file will not be prefixed
+ "classes/".
+
+ Explicitly setting the $JARCHDIR variable overrides this default
+ behavior. The old behavior of not passing any -C option to the
+ "jar" command can be preserved by explicitly setting $JARCHDIR
+ to None:
+
+ env = Environment(JARCHDIR = None)
+
+ The above setting is compatible with older versions of SCons.
+
Please note the following important changes since release 0.97.0d20070918:
-- SCons REDEFINES PYTHON open() AND file() ON Windows TO NOT PASS
diff --git a/src/engine/MANIFEST.in b/src/engine/MANIFEST.in
index 52323d2..093fbd9 100644
--- a/src/engine/MANIFEST.in
+++ b/src/engine/MANIFEST.in
@@ -6,6 +6,7 @@ SCons/compat/_scons_hashlib.py
SCons/compat/_scons_optparse.py
SCons/compat/_scons_sets.py
SCons/compat/_scons_sets15.py
+SCons/compat/_scons_shlex.py
SCons/compat/_scons_subprocess.py
SCons/compat/_scons_textwrap.py
SCons/compat/_scons_UserString.py
@@ -54,6 +55,7 @@ SCons/Scanner/Prog.py
SCons/SConf.py
SCons/SConsign.py
SCons/Script/__init__.py
+SCons/Script/Interactive.py
SCons/Script/Main.py
SCons/Script/SConscript.py
SCons/Script/SConsOptions.py
@@ -89,6 +91,7 @@ SCons/Tool/g++.py
SCons/Tool/g77.py
SCons/Tool/gas.py
SCons/Tool/gcc.py
+SCons/Tool/gfortran.py
SCons/Tool/gnulink.py
SCons/Tool/gs.py
SCons/Tool/hpc++.py
diff --git a/src/engine/SCons/Action.py b/src/engine/SCons/Action.py
index c2c1158..cd4bf6a 100644
--- a/src/engine/SCons/Action.py
+++ b/src/engine/SCons/Action.py
@@ -97,6 +97,7 @@ way for wrapping up the functions.
__revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
+import cPickle
import dis
import os
import os.path
@@ -150,6 +151,138 @@ else:
i = i+1
return string.join(result, '')
+
+def _callable_contents(obj):
+ """Return the signature contents of a callable Python object.
+ """
+ try:
+ # Test if obj is a method.
+ return _function_contents(obj.im_func)
+
+ except AttributeError:
+ try:
+ # Test if obj is a callable object.
+ return _function_contents(obj.__call__.im_func)
+
+ except AttributeError:
+ try:
+ # Test if obj is a code object.
+ return _code_contents(obj)
+
+ except AttributeError:
+ # Test if obj is a function object.
+ return _function_contents(obj)
+
+
+def _object_contents(obj):
+ """Return the signature contents of any Python object.
+
+ We have to handle the case where object contains a code object
+ since it cannot be pickled directly.
+ """
+ try:
+ # Test if obj is a method.
+ return _function_contents(obj.im_func)
+
+ except AttributeError:
+ try:
+ # Test if obj is a callable object.
+ return _function_contents(obj.__call__.im_func)
+
+ except AttributeError:
+ try:
+ # Test if obj is a code object.
+ return _code_contents(obj)
+
+ except AttributeError:
+ try:
+ # Test if obj is a function object.
+ return _function_contents(obj)
+
+ except AttributeError:
+ # Should be a picklable Python object.
+ try:
+ return cPickle.dumps(obj)
+ except (cPickle.PicklingError, TypeError):
+ # This is weird, but it seems that nested classes
+ # are unpicklable. The Python docs say it should
+ # always be a PicklingError, but some Python
+ # versions seem to return TypeError. Just do
+ # the best we can.
+ return str(obj)
+
+
+def _code_contents(code):
+ """Return the signature contents of a code object.
+
+ By providing direct access to the code object of the
+ function, Python makes this extremely easy. Hooray!
+
+ Unfortunately, older versions of Python include line
+ number indications in the compiled byte code. Boo!
+ So we remove the line number byte codes to prevent
+ recompilations from moving a Python function.
+ """
+
+ contents = []
+
+ # The code contents depends on the number of local variables
+ # but not their actual names.
+ contents.append("%s,%s" % (code.co_argcount, len(code.co_varnames)))
+ try:
+ contents.append(",%s,%s" % (len(code.co_cellvars), len(code.co_freevars)))
+ except AttributeError:
+ # Older versions of Python do not support closures.
+ contents.append(",0,0")
+
+ # The code contents depends on any constants accessed by the
+ # function. Note that we have to call _object_contents on each
+ # constant because the code object of nested functions can
+ # show up among the constants.
+ #
+ # Note that we also always ignore the first entry of co_consts
+ # which contains the function doc string. We assume that the
+ # function does not access its doc string.
+ contents.append(',(' + string.join(map(_object_contents,code.co_consts[1:]),',') + ')')
+
+ # The code contents depends on the variable names used to
+ # access global variables, as changing the variable name changes
+ # the variable actually accessed and therefore changes the
+ # function result.
+ contents.append(',(' + string.join(map(_object_contents,code.co_names),',') + ')')
+
+
+ # The code contents depends on its actual code!!!
+ contents.append(',(' + str(remove_set_lineno_codes(code.co_code)) + ')')
+
+ return string.join(contents, '')
+
+
+def _function_contents(func):
+ """Return the signature contents of a function."""
+
+ contents = [_code_contents(func.func_code)]
+
+ # The function contents depends on the value of default arguments
+ if func.func_defaults:
+ contents.append(',(' + string.join(map(_object_contents,func.func_defaults),',') + ')')
+ else:
+ contents.append(',()')
+
+ # The function contents depends on the closure captured cell values.
+ try:
+ closure = func.func_closure or []
+ except AttributeError:
+ # Older versions of Python do not support closures.
+ closure = []
+
+ #xxx = [_object_contents(x.cell_contents) for x in closure]
+ xxx = map(lambda x: _object_contents(x.cell_contents), closure)
+ contents.append(',(' + string.join(xxx, ',') + ')')
+
+ return string.join(contents, '')
+
+
def _actionAppend(act1, act2):
# This function knows how to slap two actions together.
# Mainly, it handles ListActions by concatenating into
@@ -643,6 +776,16 @@ class FunctionAction(_ActionAction):
'accepts (target, source, env) as parameters.')
self.execfunction = execfunction
+ try:
+ self.funccontents = _callable_contents(execfunction)
+ except AttributeError:
+ try:
+ # See if execfunction will do the heavy lifting for us.
+ self.gc = execfunction.get_contents
+ except AttributeError:
+ # This is weird, just do the best we can.
+ self.funccontents = _object_contents(execfunction)
+
apply(_ActionAction.__init__, (self,)+args, kw)
self.varlist = kw.get('varlist', [])
self.cmdstr = cmdstr
@@ -716,46 +859,14 @@ class FunctionAction(_ActionAction):
return result
def get_contents(self, target, source, env):
- """Return the signature contents of this callable action.
-
- By providing direct access to the code object of the
- function, Python makes this extremely easy. Hooray!
-
- Unfortunately, older versions of Python include line
- number indications in the compiled byte code. Boo!
- So we remove the line number byte codes to prevent
- recompilations from moving a Python function.
- """
- execfunction = self.execfunction
+ """Return the signature contents of this callable action."""
try:
- # Test if execfunction is a function.
- code = execfunction.func_code.co_code
+ contents = self.gc(target, source, env)
except AttributeError:
- try:
- # Test if execfunction is a method.
- code = execfunction.im_func.func_code.co_code
- except AttributeError:
- try:
- # Test if execfunction is a callable object.
- code = execfunction.__call__.im_func.func_code.co_code
- except AttributeError:
- try:
- # See if execfunction will do the heavy lifting for us.
- gc = self.execfunction.get_contents
- except AttributeError:
- # This is weird, just do the best we can.
- contents = str(self.execfunction)
- else:
- contents = gc(target, source, env)
- else:
- contents = str(code)
- else:
- contents = str(code)
- else:
- contents = str(code)
- contents = remove_set_lineno_codes(contents)
+ contents = self.funccontents
+
return contents + env.subst(string.join(map(lambda v: '${'+v+'}',
- self.varlist)))
+ self.varlist)))
def get_implicit_deps(self, target, source, env):
return []
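The helper functions added above build a signature from a function's byte code together with its constants, referenced global names, default argument values and closure cells. A rough sketch of the same idea in present-day Python, for illustration only (the function name, the use of md5, and the attribute spellings below are not from the patch):

    import hashlib

    def rough_signature(func):
        # Gather the same ingredients _function_contents() considers.
        code = func.__code__
        parts = [
            repr(code.co_code),             # the actual byte code
            repr(code.co_consts[1:]),       # constants, skipping the docstring slot
            repr(code.co_names),            # referenced global names
            repr(func.__defaults__ or ()),  # default argument values
            repr([c.cell_contents for c in (func.__closure__ or ())]),
        ]
        return hashlib.md5('\n'.join(parts).encode('utf-8')).hexdigest()

    def build_it(target, source, env, flag='-O2'):
        return None

    print(rough_signature(build_it))  # changes if a constant, default, name, etc. changes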
diff --git a/src/engine/SCons/ActionTests.py b/src/engine/SCons/ActionTests.py
index 06030e3..2ad4bef 100644
--- a/src/engine/SCons/ActionTests.py
+++ b/src/engine/SCons/ActionTests.py
@@ -1490,25 +1490,30 @@ class FunctionActionTestCase(unittest.TestCase):
def LocalFunc():
pass
- matches = [
- "d\000\000S",
- "d\x00\x00S",
+ func_matches = [
+ "0,0,0,0,(),(),(d\000\000S),(),()",
+ "0,0,0,0,(),(),(d\x00\x00S),(),()",
+ ]
+
+ meth_matches = [
+ "1,1,0,0,(),(),(d\000\000S),(),()",
+ "1,1,0,0,(),(),(d\x00\x00S),(),()",
]
a = SCons.Action.FunctionAction(GlobalFunc)
c = a.get_contents(target=[], source=[], env=Environment())
- assert c in matches, repr(c)
+ assert c in func_matches, repr(c)
a = SCons.Action.FunctionAction(LocalFunc)
c = a.get_contents(target=[], source=[], env=Environment())
- assert c in matches, repr(c)
+ assert c in func_matches, repr(c)
a = SCons.Action.FunctionAction(GlobalFunc, varlist=['XYZ'])
- matches_foo = map(lambda x: x + "foo", matches)
+ matches_foo = map(lambda x: x + "foo", func_matches)
c = a.get_contents(target=[], source=[], env=Environment())
- assert c in matches, repr(c)
+ assert c in func_matches, repr(c)
c = a.get_contents(target=[], source=[], env=Environment(XYZ = 'foo'))
assert c in matches_foo, repr(c)
@@ -1525,7 +1530,7 @@ class FunctionActionTestCase(unittest.TestCase):
lc = LocalClass()
a = SCons.Action.FunctionAction(lc.LocalMethod)
c = a.get_contents(target=[], source=[], env=Environment())
- assert c in matches, repr(c)
+ assert c in meth_matches, repr(c)
def test_strfunction(self):
"""Test the FunctionAction.strfunction() method
diff --git a/src/engine/SCons/CacheDir.py b/src/engine/SCons/CacheDir.py
index 9b2b4b4..7caee61 100644
--- a/src/engine/SCons/CacheDir.py
+++ b/src/engine/SCons/CacheDir.py
@@ -34,6 +34,7 @@ import sys
import SCons.Action
+cache_enabled = True
cache_debug = False
cache_force = False
cache_show = False
@@ -129,31 +130,33 @@ class CacheDir:
except ImportError:
msg = "No hashlib or MD5 module available, CacheDir() not supported"
SCons.Warnings.warn(SCons.Warnings.NoMD5ModuleWarning, msg)
+ self.path = None
else:
self.path = path
+ self.current_cache_debug = None
+ self.debugFP = None
- def CacheDebugWrite(self, fmt, target, cachefile):
- self.debugFP.write(fmt % (target, os.path.split(cachefile)[1]))
-
- def CacheDebugQuiet(self, fmt, target, cachefile):
- pass
-
- def CacheDebugInit(self, fmt, target, cachefile):
- if cache_debug:
+ def CacheDebug(self, fmt, target, cachefile):
+ if cache_debug != self.current_cache_debug:
if cache_debug == '-':
self.debugFP = sys.stdout
- else:
+ elif cache_debug:
self.debugFP = open(cache_debug, 'w')
- self.CacheDebug = self.CacheDebugWrite
- self.CacheDebug(fmt, target, cachefile)
- else:
- self.CacheDebug = self.CacheDebugQuiet
+ else:
+ self.debugFP = None
+ self.current_cache_debug = cache_debug
+ if self.debugFP:
+ self.debugFP.write(fmt % (target, os.path.split(cachefile)[1]))
- CacheDebug = CacheDebugInit
+ def is_enabled(self):
+ return (cache_enabled and not self.path is None)
def cachepath(self, node):
"""
"""
+ if not self.is_enabled():
+ return None, None
+
sig = node.get_cachedir_bsig()
subdir = string.upper(sig[0])
dir = os.path.join(self.path, subdir)
@@ -184,6 +187,9 @@ class CacheDir:
execute the CacheRetrieveFunc and then have the latter
explicitly check SCons.Action.execute_actions itself.
"""
+ if not self.is_enabled():
+ return False
+
retrieved = False
if cache_show:
@@ -202,16 +208,10 @@ class CacheDir:
return retrieved
def push(self, node):
+ if not self.is_enabled():
+ return
return CachePush(node, [], node.get_build_env())
def push_if_forced(self, node):
if cache_force:
return self.push(node)
-
-class Null(SCons.Util.Null):
- def repr(self):
- return 'CacheDir.Null()'
- def cachepath(self, node):
- return None, None
- def retrieve(self, node):
- return False
diff --git a/src/engine/SCons/Conftest.py b/src/engine/SCons/Conftest.py
index fcf8c5a..33899f6 100644
--- a/src/engine/SCons/Conftest.py
+++ b/src/engine/SCons/Conftest.py
@@ -371,11 +371,10 @@ int main()
}
"""
- # XXX: Try* vs CompileProg ?
- st = context.TryCompile(src % (type_name, expect), suffix)
- if st:
- _Have(context, "SIZEOF_" + type_name, str(expect))
+ st = context.CompileProg(src % (type_name, expect), suffix)
+ if not st:
context.Display("yes\n")
+ _Have(context, "SIZEOF_%s" % type_name, expect)
return expect
else:
context.Display("no\n")
@@ -400,21 +399,76 @@ int main() {
return 0;
}
"""
- ret = context.TryRun(src, suffix)
- st = ret[0]
+ st, out = context.RunProg(src, suffix)
try:
- size = int(ret[1])
- _Have(context, "SIZEOF_" + type_name, str(size))
- context.Display("%d\n" % size)
+ size = int(out)
except ValueError:
+ # If we cannot convert the test program's output to an integer (the size),
+ # something went wrong, so just fail.
+ st = 1
size = 0
- _LogFailed(context, src, st)
- context.Display(" Failed !\n")
- if st:
+
+ if not st:
+ context.Display("yes\n")
+ _Have(context, "SIZEOF_%s" % type_name, size)
return size
else:
+ context.Display("no\n")
+ _LogFailed(context, src, st)
return 0
+ return 0
+
+def CheckDeclaration(context, symbol, includes = None, language = None):
+ """Checks whether symbol is declared.
+
+ Use the same test as autoconf, that is test whether the symbol is defined
+ as a macro or can be used as an r-value.
+
+ Arguments:
+ symbol : str
+ the symbol to check
+ includes : str
+ Optional "header" can be defined to include a header file.
+ language : str
+ only C and C++ supported.
+
+ Returns:
+ status : bool
+ True if the check failed, False if succeeded."""
+
+ # Include "confdefs.h" first, so that the header can use HAVE_HEADER_H.
+ if context.headerfilename:
+ includetext = '#include "%s"' % context.headerfilename
+ else:
+ includetext = ''
+
+ if not includes:
+ includes = ""
+
+ lang, suffix, msg = _lang2suffix(language)
+ if msg:
+ context.Display("Cannot check for declaration %s: %s\n" % (type_name, msg))
+ return msg
+
+ src = includetext + includes
+ context.Display('Checking whether %s is declared... ' % symbol)
+
+ src = src + r"""
+int main()
+{
+#ifndef %s
+ (void) %s;
+#endif
+ ;
+ return 0;
+}
+""" % (symbol, symbol)
+
+ st = context.CompileProg(src, suffix)
+ _YesNoResult(context, st, "HAVE_DECL_" + symbol, src)
+ return st
+
def CheckLib(context, libs, func_name = None, header = None,
extra_libs = None, call = None, language = None, autoadd = 1):
"""
diff --git a/src/engine/SCons/Defaults.py b/src/engine/SCons/Defaults.py
index c3d30cb..3cd47ef 100644
--- a/src/engine/SCons/Defaults.py
+++ b/src/engine/SCons/Defaults.py
@@ -93,7 +93,7 @@ def DefaultEnvironment(*args, **kw):
_default_env.Decider('timestamp-match')
global DefaultEnvironment
DefaultEnvironment = _fetch_DefaultEnvironment
- _default_env._CacheDir = SCons.CacheDir.Null()
+ _default_env._CacheDir_path = None
return _default_env
# Emitters for setting the shared attribute on object files,
@@ -270,7 +270,7 @@ def _concat_ixes(prefix, list, suffix, env):
return result
-def _stripixes(prefix, list, suffix, stripprefix, stripsuffix, env, c=None):
+def _stripixes(prefix, list, suffix, stripprefixes, stripsuffixes, env, c=None):
"""
This is a wrapper around _concat()/_concat_ixes() that checks for the
existence of prefixes or suffixes on list elements and strips them
@@ -295,19 +295,39 @@ def _stripixes(prefix, list, suffix, stripprefix, stripsuffix, env, c=None):
if SCons.Util.is_List(list):
list = SCons.Util.flatten(list)
- lsp = len(stripprefix)
- lss = len(stripsuffix)
+ if SCons.Util.is_List(stripprefixes):
+ stripprefixes = map(env.subst, SCons.Util.flatten(stripprefixes))
+ else:
+ stripprefixes = [env.subst(stripprefixes)]
+
+ if SCons.Util.is_List(stripsuffixes):
+ stripsuffixes = map(env.subst, SCons.Util.flatten(stripsuffixes))
+ else:
+ stripsuffixes = [env.subst(stripsuffixes)]
+
stripped = []
for l in SCons.PathList.PathList(list).subst_path(env, None, None):
if isinstance(l, SCons.Node.FS.File):
stripped.append(l)
continue
+
if not SCons.Util.is_String(l):
l = str(l)
- if l[:lsp] == stripprefix:
- l = l[lsp:]
- if l[-lss:] == stripsuffix:
- l = l[:-lss]
+
+ for stripprefix in stripprefixes:
+ lsp = len(stripprefix)
+ if l[:lsp] == stripprefix:
+ l = l[lsp:]
+ # Do not strip more than one prefix
+ break
+
+ for stripsuffix in stripsuffixes:
+ lss = len(stripsuffix)
+ if l[-lss:] == stripsuffix:
+ l = l[:-lss]
+ # Do not strip more than one suffix
+ break
+
stripped.append(l)
return c(prefix, stripped, suffix, env)
diff --git a/src/engine/SCons/Environment.py b/src/engine/SCons/Environment.py
index cf2d0eb..02ad332 100644
--- a/src/engine/SCons/Environment.py
+++ b/src/engine/SCons/Environment.py
@@ -481,7 +481,7 @@ class SubstitutionEnvironment:
# We have an object plus a string, or multiple
# objects that we need to smush together. No choice
# but to make them into a string.
- p = string.join(map(SCons.Util.to_String, p), '')
+ p = string.join(map(SCons.Util.to_String_for_subst, p), '')
else:
p = s(p)
r.append(p)
@@ -909,11 +909,18 @@ class Base(SubstitutionEnvironment):
def get_CacheDir(self):
try:
- return self._CacheDir
+ path = self._CacheDir_path
except AttributeError:
- cd = SCons.Defaults.DefaultEnvironment()._CacheDir
- self._CacheDir = cd
- return cd
+ path = SCons.Defaults.DefaultEnvironment()._CacheDir_path
+ try:
+ if path == self._last_CacheDir_path:
+ return self._last_CacheDir
+ except AttributeError:
+ pass
+ cd = SCons.CacheDir.CacheDir(path)
+ self._last_CacheDir_path = path
+ self._last_CacheDir = cd
+ return cd
def get_factory(self, factory, default='File'):
"""Return a factory function for creating Nodes for this
@@ -1645,10 +1652,9 @@ class Base(SubstitutionEnvironment):
def CacheDir(self, path):
import SCons.CacheDir
- if path is None:
- self._CacheDir = SCons.CacheDir.Null()
- else:
- self._CacheDir = SCons.CacheDir.CacheDir(self.subst(path))
+ if not path is None:
+ path = self.subst(path)
+ self._CacheDir_path = path
def Clean(self, targets, files):
global CleanTargets
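With this refactoring the construction environment only records the cache path; the CacheDir object itself is created on demand in get_CacheDir(). From an SConstruct the usage is unchanged (a minimal sketch; the cache path is made up):

    env = Environment()
    env.CacheDir('/var/tmp/scons-cache')   # path is recorded, CacheDir object created lazily
    env.CacheDir(None)                     # disable derived-file caching for this environment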
diff --git a/src/engine/SCons/EnvironmentTests.py b/src/engine/SCons/EnvironmentTests.py
index 3f64d43..4ffff7a 100644
--- a/src/engine/SCons/EnvironmentTests.py
+++ b/src/engine/SCons/EnvironmentTests.py
@@ -576,13 +576,13 @@ class SubstitutionTestCase(unittest.TestCase):
BAR=StringableObj("bar"))
r = env.subst_path([ "${FOO}/bar", "${BAR}/baz" ])
- assert r == [ "foo/bar", "bar/baz" ]
+ assert r == [ "foo/bar", "bar/baz" ], r
r = env.subst_path([ "bar/${FOO}", "baz/${BAR}" ])
- assert r == [ "bar/foo", "baz/bar" ]
+ assert r == [ "bar/foo", "baz/bar" ], r
r = env.subst_path([ "bar/${FOO}/bar", "baz/${BAR}/baz" ])
- assert r == [ "bar/foo/bar", "baz/bar/baz" ]
+ assert r == [ "bar/foo/bar", "baz/bar/baz" ], r
def test_subst_target_source(self):
"""Test the base environment subst_target_source() method"""
@@ -764,40 +764,6 @@ sys.exit(1)
d = env.ParseFlags(s)
- if sys.version[:3] in ('1.5', '1.6', '2.0', '2.1', '2.2'):
- # Pre-2.3 Python has no shlex.split() function.
- # The compatibility layer does its best can by wrapping
- # the old shlex.shlex class, but that class doesn't really
- # understand quoting within the body of a token. We're just
- # going to live with this; it's the behavior they'd
- # have anyway if they use the shlex module...
- #
- # (Note that we must test the actual Python version numbers
- # above, not just test for whether trying to use shlex.split()
- # throws an AttributeError, because the compatibility layer
- # adds our wrapper function to the module as shlex.split().)
-
- expect_CPPPATH = ['/usr/include/fum',
- 'bar',
- '"C:\\Program']
- expect_LIBPATH = ['/usr/fax',
- 'foo',
- '"C:\\Program']
- expect_LIBS = ['Files\\ASCEND\\include"',
- 'xxx',
- 'yyy',
- 'Files\\ASCEND"',
- 'ascend']
- else:
- expect_CPPPATH = ['/usr/include/fum',
- 'bar',
- 'C:\\Program Files\\ASCEND\\include']
- expect_LIBPATH = ['/usr/fax',
- 'foo',
- 'C:\\Program Files\\ASCEND']
- expect_LIBS = ['xxx', 'yyy', 'ascend']
-
-
assert d['ASFLAGS'] == ['-as'], d['ASFLAGS']
assert d['CFLAGS'] == ['-std=c99']
assert d['CCFLAGS'] == ['-X', '-Wa,-as',
@@ -806,12 +772,16 @@ sys.exit(1)
'+DD64'], d['CCFLAGS']
assert d['CPPDEFINES'] == ['FOO', ['BAR', 'value'], 'BAZ'], d['CPPDEFINES']
assert d['CPPFLAGS'] == ['-Wp,-cpp'], d['CPPFLAGS']
- assert d['CPPPATH'] == expect_CPPPATH, d['CPPPATH']
+ assert d['CPPPATH'] == ['/usr/include/fum',
+ 'bar',
+ 'C:\\Program Files\\ASCEND\\include'], d['CPPPATH']
assert d['FRAMEWORKPATH'] == ['fwd1', 'fwd2', 'fwd3'], d['FRAMEWORKPATH']
assert d['FRAMEWORKS'] == ['Carbon'], d['FRAMEWORKS']
- assert d['LIBPATH'] == expect_LIBPATH, d['LIBPATH']
+ assert d['LIBPATH'] == ['/usr/fax',
+ 'foo',
+ 'C:\\Program Files\\ASCEND'], d['LIBPATH']
LIBS = map(str, d['LIBS'])
- assert LIBS == expect_LIBS, (d['LIBS'], LIBS)
+ assert LIBS == ['xxx', 'yyy', 'ascend'], (d['LIBS'], LIBS)
assert d['LINKFLAGS'] == ['-Wl,-link', '-pthread',
'-mno-cygwin', '-mwindows',
('-arch', 'i386'),
@@ -2589,10 +2559,10 @@ def generate(env):
env = self.TestEnvironment(CD = 'CacheDir')
env.CacheDir('foo')
- assert env._CacheDir.path == 'foo', env._CacheDir.path
+ assert env._CacheDir_path == 'foo', env._CacheDir_path
env.CacheDir('$CD')
- assert env._CacheDir.path == 'CacheDir', env._CacheDir.path
+ assert env._CacheDir_path == 'CacheDir', env._CacheDir_path
def test_Clean(self):
"""Test the Clean() method"""
diff --git a/src/engine/SCons/Executor.py b/src/engine/SCons/Executor.py
index 1cb0cf9..7222042 100644
--- a/src/engine/SCons/Executor.py
+++ b/src/engine/SCons/Executor.py
@@ -335,13 +335,11 @@ class Null(_Executor):
def get_build_env(self):
import SCons.Util
class NullEnvironment(SCons.Util.Null):
- #def get_scanner(self, key):
- # return None
- #def changed_since_last_build(self, dependency, target, prev_ni):
- # return dependency.changed_since_last_buld(target, prev_ni)
+ import SCons.CacheDir
+ _CacheDir_path = None
+ _CacheDir = SCons.CacheDir.CacheDir(None)
def get_CacheDir(self):
- import SCons.CacheDir
- return SCons.CacheDir.Null()
+ return self._CacheDir
return NullEnvironment()
def get_build_scanner_path(self):
return None
diff --git a/src/engine/SCons/Job.py b/src/engine/SCons/Job.py
index b28aaaf..7b51409 100644
--- a/src/engine/SCons/Job.py
+++ b/src/engine/SCons/Job.py
@@ -33,6 +33,18 @@ __revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
import SCons.compat
+
+# The default stack size (in kilobytes) of the threads used to execute
+# jobs in parallel.
+#
+# We use a stack size of 256 kilobytes. The default on some platforms
+# is too large and prevents us from creating enough threads to fully
+# parallelized the build. For example, the default stack size on linux
+# is 8 MBytes.
+
+default_stack_size = 256
+
+
class Jobs:
"""An instance of this class initializes N jobs, and provides
methods for starting, stopping, and waiting on all N jobs.
@@ -55,7 +67,12 @@ class Jobs:
self.job = None
if num > 1:
try:
- self.job = Parallel(taskmaster, num)
+ stack_size = SCons.Job.stack_size
+ except AttributeError:
+ stack_size = default_stack_size
+
+ try:
+ self.job = Parallel(taskmaster, num, stack_size)
self.num_jobs = num
except NameError:
pass
@@ -175,17 +192,40 @@ else:
class ThreadPool:
"""This class is responsible for spawning and managing worker threads."""
- def __init__(self, num):
- """Create the request and reply queues, and 'num' worker threads."""
+ def __init__(self, num, stack_size):
+ """Create the request and reply queues, and 'num' worker threads.
+
+ One must specify the stack size of the worker threads. The
+ stack size is specified in kilobytes.
+ """
self.requestQueue = Queue.Queue(0)
self.resultsQueue = Queue.Queue(0)
+ try:
+ prev_size = threading.stack_size(stack_size*1024)
+ except AttributeError, e:
+ # Only print a warning if the stack size has been
+ # explicitly set.
+ if hasattr(SCons.Job, 'stack_size'):
+ msg = "Setting stack size is unsupported by this version of Python:\n " + \
+ e.args[0]
+ SCons.Warnings.warn(SCons.Warnings.StackSizeWarning, msg)
+ except ValueError, e:
+ msg = "Setting stack size failed:\n " + \
+ e.message
+ SCons.Warnings.warn(SCons.Warnings.StackSizeWarning, msg)
+
# Create worker threads
self.workers = []
for _ in range(num):
worker = Worker(self.requestQueue, self.resultsQueue)
self.workers.append(worker)
+ # Once we drop Python 1.5 we can change the following to:
+ #if 'prev_size' in locals():
+ if 'prev_size' in locals().keys():
+ threading.stack_size(prev_size)
+
def put(self, obj):
"""Put task into request queue."""
self.requestQueue.put(obj)
@@ -233,7 +273,7 @@ else:
This class is thread safe.
"""
- def __init__(self, taskmaster, num):
+ def __init__(self, taskmaster, num, stack_size):
"""Create a new parallel job given a taskmaster.
The taskmaster's next_task() method should return the next
@@ -249,7 +289,7 @@ else:
multiple tasks simultaneously. """
self.taskmaster = taskmaster
- self.tp = ThreadPool(num)
+ self.tp = ThreadPool(num, stack_size)
self.maxjobs = num
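As a rough standalone illustration (not SCons code) of the save-and-restore pattern the ThreadPool above uses, the following sketch assumes an interpreter that provides threading.stack_size() (Python 2.5 or later) and falls back to the platform default otherwise:

    import threading

    def start_workers(num, stack_size_kb, target):
        # Temporarily lower the stack size used for newly started threads.
        try:
            prev_size = threading.stack_size(stack_size_kb * 1024)
        except (AttributeError, ValueError):
            prev_size = None   # unsupported interpreter or invalid size
        workers = []
        for _ in range(num):
            t = threading.Thread(target=target)
            t.start()          # the stack is allocated when the thread starts
            workers.append(t)
        # Restore the previous size so unrelated threads are unaffected.
        if prev_size is not None:
            threading.stack_size(prev_size)
        return workers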
diff --git a/src/engine/SCons/JobTests.py b/src/engine/SCons/JobTests.py
index 5f056e8..c432581 100644
--- a/src/engine/SCons/JobTests.py
+++ b/src/engine/SCons/JobTests.py
@@ -293,7 +293,7 @@ class SerialTestCase(unittest.TestCase):
class NoParallelTestCase(unittest.TestCase):
def runTest(self):
"test handling lack of parallel support"
- def NoParallel(tm, num):
+ def NoParallel(tm, num, stack_size):
raise NameError
save_Parallel = SCons.Job.Parallel
SCons.Job.Parallel = NoParallel
diff --git a/src/engine/SCons/Memoize.py b/src/engine/SCons/Memoize.py
index c2b4181..c4a5001 100644
--- a/src/engine/SCons/Memoize.py
+++ b/src/engine/SCons/Memoize.py
@@ -217,33 +217,47 @@ class Memoizer:
class M:
def __init__(cls, name, bases, cls_dict):
- cls.has_metaclass = 1
-
-class A:
- __metaclass__ = M
+ cls.use_metaclass = 1
+ def fake_method(self):
+ pass
+ new.instancemethod(fake_method, None, cls)
try:
- has_metaclass = A.has_metaclass
+ class A:
+ __metaclass__ = M
+
+ use_metaclass = A.use_metaclass
except AttributeError:
- has_metaclass = None
+ use_metaclass = None
+ reason = 'no metaclasses'
+except TypeError:
+ use_metaclass = None
+ reason = 'new.instancemethod() bug'
+else:
+ del A
del M
-del A
-if not has_metaclass:
+if not use_metaclass:
def Dump(title):
pass
- class Memoized_Metaclass:
- # Just a place-holder so pre-metaclass Python versions don't
- # have to have special code for the Memoized classes.
- pass
+ try:
+ class Memoized_Metaclass(type):
+ # Just a place-holder so pre-metaclass Python versions don't
+ # have to have special code for the Memoized classes.
+ pass
+ except TypeError:
+ class Memoized_Metaclass:
+ # A place-holder so pre-metaclass Python versions don't
+ # have to have special code for the Memoized classes.
+ pass
def EnableMemoization():
import SCons.Warnings
- msg = 'memoization is not supported in this version of Python (no metaclasses)'
- raise SCons.Warnings.NoMetaclassSupportWarning, msg
+ msg = 'memoization is not supported in this version of Python (%s)'
+ raise SCons.Warnings.NoMetaclassSupportWarning, msg % reason
else:
diff --git a/src/engine/SCons/MemoizeTests.py b/src/engine/SCons/MemoizeTests.py
index 7102f30..bceeebf 100644
--- a/src/engine/SCons/MemoizeTests.py
+++ b/src/engine/SCons/MemoizeTests.py
@@ -132,7 +132,7 @@ class CountDictTestCase(unittest.TestCase):
c = obj.get_memoizer_counter('dict')
- if SCons.Memoize.has_metaclass:
+ if SCons.Memoize.use_metaclass:
assert c.hit == 3, c.hit
assert c.miss == 2, c.miss
else:
@@ -171,7 +171,7 @@ class CountValueTestCase(unittest.TestCase):
c = obj.get_memoizer_counter('value')
- if SCons.Memoize.has_metaclass:
+ if SCons.Memoize.use_metaclass:
assert c.hit == 3, c.hit
assert c.miss == 1, c.miss
else:
diff --git a/src/engine/SCons/Node/FS.py b/src/engine/SCons/Node/FS.py
index d0843d1..1a3c010 100644
--- a/src/engine/SCons/Node/FS.py
+++ b/src/engine/SCons/Node/FS.py
@@ -2469,38 +2469,20 @@ class File(Base):
self.get_build_env().get_CacheDir().push_if_forced(self)
ninfo = self.get_ninfo()
- old = self.get_stored_info()
-
- csig = None
- mtime = self.get_timestamp()
- size = self.get_size()
-
- max_drift = self.fs.max_drift
- if max_drift > 0:
- if (time.time() - mtime) > max_drift:
- try:
- n = old.ninfo
- if n.timestamp and n.csig and n.timestamp == mtime:
- csig = n.csig
- except AttributeError:
- pass
- elif max_drift == 0:
- try:
- csig = old.ninfo.csig
- except AttributeError:
- pass
+ csig = self.get_max_drift_csig()
if csig:
ninfo.csig = csig
- ninfo.timestamp = mtime
- ninfo.size = size
+ ninfo.timestamp = self.get_timestamp()
+ ninfo.size = self.get_size()
if not self.has_builder():
# This is a source file, but it might have been a target file
# in another build that included more of the DAG. Copy
# any build information that's stored in the .sconsign file
# into our binfo object so it doesn't get lost.
+ old = self.get_stored_info()
self.get_binfo().__dict__.update(old.binfo.__dict__)
self.store_info()
@@ -2638,6 +2620,33 @@ class File(Base):
# SIGNATURE SUBSYSTEM
#
+ def get_max_drift_csig(self):
+ """
+ Returns the content signature currently stored for this node if
+ the node has been unmodified for longer than the max_drift value,
+ or if the max_drift value is 0. Returns None otherwise.
+ """
+ old = self.get_stored_info()
+ mtime = self.get_timestamp()
+
+ csig = None
+ max_drift = self.fs.max_drift
+ if max_drift > 0:
+ if (time.time() - mtime) > max_drift:
+ try:
+ n = old.ninfo
+ if n.timestamp and n.csig and n.timestamp == mtime:
+ csig = n.csig
+ except AttributeError:
+ pass
+ elif max_drift == 0:
+ try:
+ csig = old.ninfo.csig
+ except AttributeError:
+ pass
+
+ return csig
+
def get_csig(self):
"""
Generate a node's content signature, the digested signature
@@ -2653,16 +2662,19 @@ class File(Base):
except AttributeError:
pass
- try:
- contents = self.get_contents()
- except IOError:
- # This can happen if there's actually a directory on-disk,
- # which can be the case if they've disabled disk checks,
- # or if an action with a File target actually happens to
- # create a same-named directory by mistake.
- csig = ''
- else:
- csig = SCons.Util.MD5signature(contents)
+ csig = self.get_max_drift_csig()
+ if csig is None:
+
+ try:
+ contents = self.get_contents()
+ except IOError:
+ # This can happen if there's actually a directory on-disk,
+ # which can be the case if they've disabled disk checks,
+ # or if an action with a File target actually happens to
+ # create a same-named directory by mistake.
+ csig = ''
+ else:
+ csig = SCons.Util.MD5signature(contents)
ninfo.csig = csig
@@ -2842,14 +2854,14 @@ class FileFinder:
It would be more compact to just use this as a nested function
with a default keyword argument (see the commented-out version
below), but that doesn't work unless you have nested scopes,
- so we define it here just this works work under Python 1.5.2.
+ so we define it here just so this works under Python 1.5.2.
"""
if fd is None:
fd = self.default_filedir
dir, name = os.path.split(fd)
drive, d = os.path.splitdrive(dir)
if d in ('/', os.sep):
- return p
+ return p.fs.get_root(drive).dir_on_disk(name)
if dir:
p = self.filedir_lookup(p, dir)
if not p:
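In isolation, the max_drift decision that the new get_max_drift_csig() method centralizes can be sketched as follows (illustrative standalone code, not SCons itself):

    import time

    def reuse_stored_csig(stored_csig, stored_timestamp, mtime, max_drift):
        # max_drift == 0 means "always trust the stored signature".
        if max_drift == 0:
            return stored_csig
        # Otherwise reuse the stored signature only if the file is older
        # than the drift window and its stored timestamp still matches.
        if max_drift > 0 and (time.time() - mtime) > max_drift:
            if stored_timestamp and stored_csig and stored_timestamp == mtime:
                return stored_csig
        return None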
diff --git a/src/engine/SCons/Node/NodeTests.py b/src/engine/SCons/Node/NodeTests.py
index fe42035..8e9a3f8 100644
--- a/src/engine/SCons/Node/NodeTests.py
+++ b/src/engine/SCons/Node/NodeTests.py
@@ -630,13 +630,13 @@ class NodeTestCase(unittest.TestCase):
# XXX additional tests for the guts of the functionality some day
- def test_del_binfo(self):
- """Test deleting the build information from a Node
- """
- node = SCons.Node.Node()
- node.binfo = None
- node.del_binfo()
- assert not hasattr(node, 'binfo'), node
+ #def test_del_binfo(self):
+ # """Test deleting the build information from a Node
+ # """
+ # node = SCons.Node.Node()
+ # node.binfo = None
+ # node.del_binfo()
+ # assert not hasattr(node, 'binfo'), node
def test_store_info(self):
"""Test calling the method to store build information
diff --git a/src/engine/SCons/Node/__init__.py b/src/engine/SCons/Node/__init__.py
index f252151..4ca34e0 100644
--- a/src/engine/SCons/Node/__init__.py
+++ b/src/engine/SCons/Node/__init__.py
@@ -375,7 +375,6 @@ class Node:
# waiting for this Node to be built.
for parent in self.waiting_parents.keys():
parent.implicit = None
- parent.del_binfo()
self.clear()
@@ -433,14 +432,10 @@ class Node:
can be re-evaluated by interfaces that do continuous integration
builds).
"""
- # Note in case it's important in the future: We also used to clear
- # the build information (the lists of dependencies) here like this:
- #
- # self.del_binfo()
- #
- # But we now rely on the fact that we're going to look at that
- # once before the build, and then store the results in the
- # .sconsign file after the build.
+ # The del_binfo() call here isn't necessary for normal execution,
+ # but is for interactive mode, where we might rebuild the same
+ # target and need to start from scratch.
+ self.del_binfo()
self.clear_memoized_values()
self.ninfo = self.new_ninfo()
self.executor_cleanup()
@@ -639,8 +634,6 @@ class Node:
# so we must recalculate the implicit deps:
self.implicit = []
self.implicit_dict = {}
- self._children_reset()
- self.del_binfo()
# Have the executor scan the sources.
executor.scan_sources(self.builder.source_scanner)
@@ -1013,6 +1006,7 @@ class Node:
# entries to equal the new dependency list, for the benefit
# of the loop below that updates node information.
then.extend([None] * diff)
+ if t: Trace(': old %s new %s' % (len(then), len(children)))
result = True
for child, prev_ni in zip(children, then):
diff --git a/src/engine/SCons/PathList.py b/src/engine/SCons/PathList.py
index be645ca..ae00fc0 100644
--- a/src/engine/SCons/PathList.py
+++ b/src/engine/SCons/PathList.py
@@ -59,7 +59,7 @@ def node_conv(obj):
try:
get = obj.get
except AttributeError:
- if isinstance(obj, SCons.Node.Node):
+ if isinstance(obj, SCons.Node.Node) or SCons.Util.is_Sequence( obj ):
result = obj
else:
result = str(obj)
@@ -132,10 +132,9 @@ class _PathList:
value = env.subst(value, target=target, source=source,
conv=node_conv)
if SCons.Util.is_Sequence(value):
- # It came back as a string or tuple, which in this
- # case usually means some variable expanded to an
- # actually Dir node. Concatenate the values.
- value = string.join(map(str, value), '')
+ result.extend(value)
+ continue
+
elif type == TYPE_OBJECT:
value = node_conv(value)
if value:
diff --git a/src/engine/SCons/Platform/posix.py b/src/engine/SCons/Platform/posix.py
index 1d4e9f7..afdabe1 100644
--- a/src/engine/SCons/Platform/posix.py
+++ b/src/engine/SCons/Platform/posix.py
@@ -235,7 +235,7 @@ def generate(env):
env['LIBSUFFIX'] = '.a'
env['SHLIBPREFIX'] = '$LIBPREFIX'
env['SHLIBSUFFIX'] = '.so'
- env['LIBPREFIXES'] = '$LIBPREFIX'
+ env['LIBPREFIXES'] = [ '$LIBPREFIX' ]
env['LIBSUFFIXES'] = [ '$LIBSUFFIX', '$SHLIBSUFFIX' ]
env['PSPAWN'] = pspawn
env['SPAWN'] = spawn
diff --git a/src/engine/SCons/SConf.py b/src/engine/SCons/SConf.py
index ae3a77e..c5de498 100644
--- a/src/engine/SCons/SConf.py
+++ b/src/engine/SCons/SConf.py
@@ -404,11 +404,12 @@ class SConfBase:
'CheckFunc' : CheckFunc,
'CheckType' : CheckType,
'CheckTypeSize' : CheckTypeSize,
+ 'CheckDeclaration' : CheckDeclaration,
'CheckHeader' : CheckHeader,
'CheckCHeader' : CheckCHeader,
'CheckCXXHeader' : CheckCXXHeader,
'CheckLib' : CheckLib,
- 'CheckLibWithHeader' : CheckLibWithHeader
+ 'CheckLibWithHeader' : CheckLibWithHeader,
}
self.AddTests(default_tests)
self.AddTests(custom_tests)
@@ -425,6 +426,31 @@ class SConfBase:
self._shutdown()
return self.env
+ def Define(self, name, value = None, comment = None):
+ """
+ Define a preprocessor symbol name, with the optional given value, in the
+ current config header.
+
+ If value is None (default), then #define name is written. If value is
+ not None, then #define name value is written.
+
+ comment is a string which will be put as a C comment in the
+ header, to explain the meaning of the value (appropriate C comments /*
+ and */ will be added automatically)."""
+ lines = []
+ if comment:
+ comment_str = "/* %s */" % comment
+ lines.append(comment_str)
+
+ if value is not None:
+ define_str = "#define %s %s" % (name, value)
+ else:
+ define_str = "#define %s" % name
+ lines.append(define_str)
+ lines.append('')
+
+ self.config_h_text = self.config_h_text + string.join(lines, '\n')
+
def BuildNodes(self, nodes):
"""
Tries to build the given nodes immediately. Returns 1 on success,
@@ -797,6 +823,12 @@ class CheckContext:
# TODO: should use self.vardict for $CC, $CPPFLAGS, etc.
return not self.TryBuild(self.env.Object, text, ext)
+ def RunProg(self, text, ext):
+ self.sconf.cached = 1
+ # TODO: should use self.vardict for $CC, $CPPFLAGS, etc.
+ st, out = self.TryRun(text, ext)
+ return not st, out
+
def AppendLIBS(self, lib_name_list):
oldLIBS = self.env.get( 'LIBS', [] )
self.env.Append(LIBS = lib_name_list)
@@ -855,6 +887,13 @@ def CheckTypeSize(context, type_name, includes = "", language = None, expect = N
context.did_show_result = 1
return res
+def CheckDeclaration(context, declaration, includes = "", language = None):
+ res = SCons.Conftest.CheckDeclaration(context, declaration,
+ includes = includes,
+ language = language)
+ context.did_show_result = 1
+ return not res
+
def createIncludesFromHeaders(headers, leaveLast, include_quotes = '""'):
# used by CheckHeader and CheckLibWithHeader to produce C - #include
# statements from the specified header (list)
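A hypothetical SConstruct fragment (not part of this patch) showing how the new Define() and CheckDeclaration() calls might be used together; the Configure(..., config_h=...) form and the symbol names here are illustrative assumptions:

    env = Environment()
    conf = Configure(env, config_h = 'config.h')  # assumes config_h is accepted here
    if conf.CheckDeclaration('malloc', includes = '#include <stdlib.h>'):
        conf.Define('HAVE_MALLOC_DECL', 1, 'malloc is declared in stdlib.h')
    conf.Define('PACKAGE_VERSION', '"1.2.3"', 'illustrative version string')
    env = conf.Finish()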
diff --git a/src/engine/SCons/SConfTests.py b/src/engine/SCons/SConfTests.py
index 601c5eb..f7d33f8 100644
--- a/src/engine/SCons/SConfTests.py
+++ b/src/engine/SCons/SConfTests.py
@@ -497,6 +497,42 @@ int main() {
finally:
sconf.Finish()
+ def test_Define(self):
+ """Test SConf.Define()
+ """
+ self._resetSConfState()
+ sconf = self.SConf.SConf(self.scons_env,
+ conf_dir=self.test.workpath('config.tests'),
+ log_file=self.test.workpath('config.log'),
+ config_h = self.test.workpath('config.h'))
+ try:
+ # XXX: we test the generated config.h string. This is not ideal;
+ # we would rather test whether the generated file, when included
+ # in a test program, does what we want.
+
+ # Test defining one symbol without a value
+ sconf.config_h_text = ''
+ sconf.Define('YOP')
+ assert sconf.config_h_text == '#define YOP\n'
+
+ # Test defining one symbol with integer value
+ sconf.config_h_text = ''
+ sconf.Define('YOP', 1)
+ assert sconf.config_h_text == '#define YOP 1\n'
+
+ # Test defining one symbol with string value
+ sconf.config_h_text = ''
+ sconf.Define('YOP', '"YIP"')
+ assert sconf.config_h_text == '#define YOP "YIP"\n'
+
+ # Test defining one symbol with an unquoted string value
+ sconf.config_h_text = ''
+ sconf.Define('YOP', "YIP")
+ assert sconf.config_h_text == '#define YOP YIP\n'
+
+ finally:
+ sconf.Finish()
+
def test_CheckTypeSize(self):
"""Test SConf.CheckTypeSize()
"""
@@ -531,6 +567,25 @@ int main() {
finally:
sconf.Finish()
+ def test_CheckDeclaration(self):
+ """Test SConf.CheckDeclaration()
+ """
+ self._resetSConfState()
+ sconf = self.SConf.SConf(self.scons_env,
+ conf_dir=self.test.workpath('config.tests'),
+ log_file=self.test.workpath('config.log'))
+ try:
+ # In ANSI C, malloc should be available in stdlib
+ r = sconf.CheckDeclaration('malloc', includes = "#include <stdlib.h>")
+ assert r, "malloc not declared ??"
+ # For C++, __cplusplus should be declared
+ r = sconf.CheckDeclaration('__cplusplus', language = 'C++')
+ assert r, "__cplusplus not declared in C++ ??"
+ r = sconf.CheckDeclaration('__cplusplus', language = 'C')
+ assert not r, "__cplusplus declared in C ??"
+ finally:
+ sconf.Finish()
+
def test_(self):
"""Test SConf.CheckType()
"""
diff --git a/src/engine/SCons/Scanner/C.py b/src/engine/SCons/Scanner/C.py
index 276570e..4356c7a 100644
--- a/src/engine/SCons/Scanner/C.py
+++ b/src/engine/SCons/Scanner/C.py
@@ -31,10 +31,94 @@ __revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
import SCons.Node.FS
import SCons.Scanner
+import SCons.Util
+
+import SCons.cpp
+
+class SConsCPPScanner(SCons.cpp.PreProcessor):
+ """
+ SCons-specific subclass of the cpp.py module's processing.
+
+ We subclass this so that: 1) we can deal with files represented
+ by Nodes, not strings; 2) we can keep track of the files that are
+ missing.
+ """
+ def __init__(self, *args, **kw):
+ apply(SCons.cpp.PreProcessor.__init__, (self,)+args, kw)
+ self.missing = []
+ def initialize_result(self, fname):
+ self.result = SCons.Util.UniqueList([fname])
+ def finalize_result(self, fname):
+ return self.result[1:]
+ def find_include_file(self, t):
+ keyword, quote, fname = t
+ result = SCons.Node.FS.find_file(fname, self.searchpath[quote])
+ if not result:
+ self.missing.append((fname, self.current_file))
+ return result
+ def read_file(self, file):
+ try:
+ fp = open(str(file.rfile()))
+ except EnvironmentError, e:
+ self.missing.append((file, self.current_file))
+ return ''
+ else:
+ return fp.read()
+
+def dictify_CPPDEFINES(env):
+ cppdefines = env.get('CPPDEFINES', {})
+ if cppdefines is None:
+ return {}
+ if SCons.Util.is_Sequence(cppdefines):
+ result = {}
+ for c in cppdefines:
+ if SCons.Util.is_Sequence(c):
+ result[c[0]] = c[1]
+ else:
+ result[c] = None
+ return result
+ if not SCons.Util.is_Dict(cppdefines):
+ return {cppdefines : None}
+ return cppdefines
+
+class SConsCPPScannerWrapper:
+ """
+ The SCons wrapper around a cpp.py scanner.
+
+ This is the actual glue between the calling conventions of generic
+ SCons scanners, and the (subclass of) cpp.py class that knows how
+ to look for #include lines with reasonably real C-preprocessor-like
+ evaluation of #if/#ifdef/#else/#elif lines.
+ """
+ def __init__(self, name, variable):
+ self.name = name
+ self.path = SCons.Scanner.FindPathDirs(variable)
+ def __call__(self, node, env, path = ()):
+ cpp = SConsCPPScanner(current = node.get_dir(),
+ cpppath = path,
+ dict = dictify_CPPDEFINES(env))
+ result = cpp(node)
+ for included, includer in cpp.missing:
+ fmt = "No dependency generated for file: %s (included from: %s) -- file not found"
+ SCons.Warnings.warn(SCons.Warnings.DependencyWarning,
+ fmt % (included, includer))
+ return result
+
+ def recurse_nodes(self, nodes):
+ return nodes
+ def select(self, node):
+ return self
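The normalization performed by dictify_CPPDEFINES() above can be shown with a standalone sketch that mirrors its logic using plain isinstance() checks in place of the SCons.Util helpers (illustrative only):

    def dictify(cppdefines):
        # Normalize the usual $CPPDEFINES forms (single name, list of
        # names and (name, value) pairs, or dict) into one dict.
        if cppdefines is None:
            return {}
        if isinstance(cppdefines, (list, tuple)):
            result = {}
            for c in cppdefines:
                if isinstance(c, (list, tuple)):
                    result[c[0]] = c[1]
                else:
                    result[c] = None
            return result
        if not isinstance(cppdefines, dict):
            return {cppdefines: None}
        return cppdefines

    print(dictify('FOO'))                # {'FOO': None}
    print(dictify(['FOO', ('BAR', 1)]))  # {'FOO': None, 'BAR': 1}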
def CScanner():
"""Return a prototype Scanner instance for scanning source files
that use the C pre-processor"""
+
+ # Here's how we would (or might) use the CPP scanner code above that
+ # knows how to evaluate #if/#ifdef/#else/#elif lines when searching
+ # for #includes. This is commented out for now until we add the
+ # right configurability to let users pick between the scanners.
+ #return SConsCPPScannerWrapper("CScanner", "CPPPATH")
+
cs = SCons.Scanner.ClassicCPP("CScanner",
"$CPPSUFFIXES",
"CPPPATH",
diff --git a/src/engine/SCons/Scanner/D.py b/src/engine/SCons/Scanner/D.py
index 5a0b383..bfbcd5d 100644
--- a/src/engine/SCons/Scanner/D.py
+++ b/src/engine/SCons/Scanner/D.py
@@ -32,22 +32,37 @@ Coded by Andy Friesen
__revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
+import re
import string
import SCons.Scanner
def DScanner():
"""Return a prototype Scanner instance for scanning D source files"""
- ds = D(name = "DScanner",
- suffixes = '$DSUFFIXES',
- path_variable = 'DPATH',
- regex = 'import\s+([^\;]*)\;')
+ ds = D()
return ds
class D(SCons.Scanner.Classic):
+ def __init__ (self):
+ SCons.Scanner.Classic.__init__ (self,
+ name = "DScanner",
+ suffixes = '$DSUFFIXES',
+ path_variable = 'DPATH',
+ regex = 'import\s+(?:[a-zA-Z0-9_.]+)\s*(?:,\s*(?:[a-zA-Z0-9_.]+)\s*)*;')
+
+ self.cre2 = re.compile ('(?:import\s)?\s*([a-zA-Z0-9_.]+)\s*(?:,|;)', re.M)
+
def find_include(self, include, source_dir, path):
# translate dots (package separators) to slashes
inc = string.replace(include, '.', '/')
i = SCons.Node.FS.find_file(inc + '.d', (source_dir,) + path)
+ if i is None:
+ i = SCons.Node.FS.find_file (inc + '.di', (source_dir,) + path)
return i, include
+
+ def find_include_names(self, node):
+ includes = []
+ for i in self.cre.findall(node.get_contents()):
+ includes = includes + self.cre2.findall(i)
+ return includes
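The two regular expressions used by the new D scanner can be exercised on their own; this sketch (not part of the patch) shows how a multi-module import statement is split into individual module names:

    import re

    cre = re.compile(r'import\s+(?:[a-zA-Z0-9_.]+)\s*'
                     r'(?:,\s*(?:[a-zA-Z0-9_.]+)\s*)*;', re.M)
    cre2 = re.compile(r'(?:import\s)?\s*([a-zA-Z0-9_.]+)\s*(?:,|;)', re.M)

    source = "import std.stdio, std.string;\nimport mylib.util;"
    modules = []
    for statement in cre.findall(source):        # whole import statements
        modules.extend(cre2.findall(statement))  # individual module names
    print(modules)   # ['std.stdio', 'std.string', 'mylib.util']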
diff --git a/src/engine/SCons/Scanner/LaTeX.py b/src/engine/SCons/Scanner/LaTeX.py
index c0a38b5..ceb9bf5 100644
--- a/src/engine/SCons/Scanner/LaTeX.py
+++ b/src/engine/SCons/Scanner/LaTeX.py
@@ -56,7 +56,7 @@ class LaTeX(SCons.Scanner.Classic):
but leave the file name untouched for "includegraphics." For
the "bibliography" keyword we need to add .bib if there is
no extension. (This need to be revisited since if there
- is no extension for an :includegraphics" keyword latex will
+ is no extension for an "includegraphics" keyword latex will
append .ps or .eps to find the file; while pdftex will use
other extensions.)
"""
diff --git a/src/engine/SCons/Scanner/ScannerTests.py b/src/engine/SCons/Scanner/ScannerTests.py
index 64d6d77..6e9286a 100644
--- a/src/engine/SCons/Scanner/ScannerTests.py
+++ b/src/engine/SCons/Scanner/ScannerTests.py
@@ -70,9 +70,12 @@ class FindPathDirsTestCase(unittest.TestCase):
env = DummyEnvironment(LIBPATH = [ 'foo' ])
env.fs = DummyFS()
+ env.fs._cwd = DummyNode('cwd')
dir = DummyNode('dir', ['xxx'])
fpd = SCons.Scanner.FindPathDirs('LIBPATH')
+ result = fpd(env)
+ assert str(result) == "('foo',)", result
result = fpd(env, dir)
assert str(result) == "('xxx', 'foo')", result
diff --git a/src/engine/SCons/Scanner/__init__.py b/src/engine/SCons/Scanner/__init__.py
index c8ab155..924b271 100644
--- a/src/engine/SCons/Scanner/__init__.py
+++ b/src/engine/SCons/Scanner/__init__.py
@@ -67,7 +67,7 @@ class FindPathDirs:
will return all of the *path directories."""
def __init__(self, variable):
self.variable = variable
- def __call__(self, env, dir, target=None, source=None, argument=None):
+ def __call__(self, env, dir=None, target=None, source=None, argument=None):
import SCons.PathList
try:
path = env[self.variable]
@@ -346,13 +346,16 @@ class Classic(Current):
def sort_key(self, include):
return SCons.Node.FS._my_normcase(include)
+ def find_include_names(self, node):
+ return self.cre.findall(node.get_contents())
+
def scan(self, node, path=()):
# cache the includes list in node so we only scan it once:
if node.includes != None:
includes = node.includes
else:
- includes = self.cre.findall(node.get_contents())
+ includes = self.find_include_names (node)
node.includes = includes
# This is a hand-coded DSU (decorate-sort-undecorate, or
diff --git a/src/engine/SCons/Script/Interactive.py b/src/engine/SCons/Script/Interactive.py
new file mode 100644
index 0000000..e38c400
--- /dev/null
+++ b/src/engine/SCons/Script/Interactive.py
@@ -0,0 +1,359 @@
+#
+# __COPYRIGHT__
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
+# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
+# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+#
+
+__revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
+
+__doc__ = """
+SCons interactive mode
+"""
+
+# TODO:
+#
+# This has the potential to grow into something with a really big life
+# of its own, which might or might not be a good thing. Nevertheless,
+# here are some enhancements that will probably be requested some day
+# and are worth keeping in mind (assuming this takes off):
+#
+# - A command to re-read / re-load the SConscript files. This may
+# involve allowing people to specify command-line options (e.g. -f,
+# -I, --no-site-dir) that affect how the SConscript files are read.
+#
+# - Additional command-line options on the "build" command.
+#
+# Of the supported options that seemed to make sense (after a quick
+# pass through the list), the ones that seemed likely enough to be
+# used are listed in the man page and have explicit test scripts.
+#
+# These had code changed in Script/Main.py to support them, but didn't
+# seem likely to be used regularly, so had no test scripts added:
+#
+# build --diskcheck=*
+# build --implicit-cache=*
+# build --implicit-deps-changed=*
+# build --implicit-deps-unchanged=*
+#
+# These look like they should "just work" with no changes to the
+# existing code, but like those above, look unlikely to be used and
+# therefore had no test scripts added:
+#
+# build --random
+#
+# These I'm not sure about. They might be useful for individual
+# "build" commands, and may even work, but they seem unlikely enough
+# that we'll wait until they're requested before spending any time on
+# writing test scripts for them, or investigating whether they work.
+#
+# build -q [??? is there a useful analog to the exit status?]
+# build --duplicate=
+# build --profile=
+# build --max-drift=
+# build --warn=*
+# build --Y
+#
+# - Most of the SCons command-line options that the "build" command
+# supports should be settable as default options that apply to all
+# subsequent "build" commands. Maybe a "set {option}" command that
+# maps to "SetOption('{option}')".
+#
+# - Need something in the 'help' command that prints the -h output.
+#
+# - A command to run the configure subsystem separately (must see how
+# this interacts with the new automake model).
+#
+# - Command-line completion of target names; maybe even of SCons options?
+# Completion is something that's supported by the Python cmd module,
+# so this should be doable without too much trouble.
+#
+
+import cmd
+import copy
+import os
+import re
+import shlex
+import string
+import sys
+
+try:
+ import readline
+except ImportError:
+ pass
+
+from SCons.Debug import Trace
+
+class SConsInteractiveCmd(cmd.Cmd):
+ """\
+ build [TARGETS] Build the specified TARGETS and their dependencies.
+ 'b' is a synonym.
+ clean [TARGETS] Clean (remove) the specified TARGETS and their
+ dependencies. 'c' is a synonym.
+ exit Exit SCons interactive mode.
+ help [COMMAND] Prints help for the specified COMMAND. 'h' and
+ '?' are synonyms.
+ shell [COMMANDLINE] Execute COMMANDLINE in a subshell. 'sh' and '!'
+ are synonyms.
+ version Prints SCons version information.
+ """
+
+ synonyms = {
+ 'b' : 'build',
+ 'c' : 'clean',
+ 'h' : 'help',
+ 'scons' : 'build',
+ 'sh' : 'shell',
+ }
+
+ def __init__(self, **kw):
+ cmd.Cmd.__init__(self)
+ for key, val in kw.items():
+ setattr(self, key, val)
+
+ if sys.platform == 'win32':
+ self.shell_variable = 'COMSPEC'
+ else:
+ self.shell_variable = 'SHELL'
+
+ def default(self, argv):
+ print "*** Unknown command: %s" % argv[0]
+
+ def onecmd(self, line):
+ line = string.strip(line)
+ if not line:
+ print self.lastcmd
+ return self.emptyline()
+ self.lastcmd = line
+ if line[0] == '!':
+ line = 'shell ' + line[1:]
+ elif line[0] == '?':
+ line = 'help ' + line[1:]
+ argv = shlex.split(line)
+ argv[0] = self.synonyms.get(argv[0], argv[0])
+ if not argv[0]:
+ return self.default(line)
+ else:
+ try:
+ func = getattr(self, 'do_' + argv[0])
+ except AttributeError:
+ return self.default(argv)
+ return func(argv)
+
+ def do_build(self, argv):
+ """\
+ build [TARGETS] Build the specified TARGETS and their
+ dependencies. 'b' is a synonym.
+ """
+ import SCons.SConsign
+ import SCons.Script.Main
+
+ options = copy.deepcopy(self.options)
+
+ options, targets = self.parser.parse_args(argv[1:], values=options)
+
+ SCons.Script.COMMAND_LINE_TARGETS = targets
+
+ if targets:
+ SCons.Script.BUILD_TARGETS = targets
+ else:
+ # If the user didn't specify any targets on the command line,
+ # use the list of default targets.
+ SCons.Script.BUILD_TARGETS = SCons.Script._build_plus_default
+
+ nodes = SCons.Script.Main._build_targets(self.fs,
+ options,
+ targets,
+ self.target_top)
+
+ if not nodes:
+ return
+
+ # Clean up so that we can perform the next build correctly.
+ #
+ # We do this by walking over all the children of the targets,
+ # and clearing their state.
+ #
+ # We currently have to re-scan each node to find their
+ # children, because built nodes have already been partially
+ # cleared and don't remember their children. (In scons
+ # 0.96.1 and earlier, this wasn't the case, and we didn't
+ # have to re-scan the nodes.)
+ #
+ # Because we have to re-scan each node, we can't clear the
+ # nodes as we walk over them, because we may end up rescanning
+ # a cleared node as we scan a later node. Therefore, only
+ # store the list of nodes that need to be cleared as we walk
+ # the tree, and clear them in a separate pass.
+ #
+ # XXX: Someone more familiar with the inner workings of scons
+ # may be able to point out a more efficient way to do this.
+
+ SCons.Script.Main.progress_display("scons: Clearing cached node information ...")
+
+ seen_nodes = {}
+
+ def get_unseen_children(node, parent, seen_nodes=seen_nodes):
+ def is_unseen(node, seen_nodes=seen_nodes):
+ return not seen_nodes.has_key(node)
+ return filter(is_unseen, node.children(scan=1))
+
+ def add_to_seen_nodes(node, parent, seen_nodes=seen_nodes):
+ seen_nodes[node] = 1
+
+ # If this file is in a BuildDir and has a
+ # corresponding source file in the source tree, remember the
+ # node in the source tree, too. This is needed in
+ # particular to clear cached implicit dependencies on the
+ # source file, since the scanner will scan it if the
+ # BuildDir was created with duplicate=0.
+ try:
+ rfile_method = node.rfile
+ except AttributeError:
+ return
+ else:
+ rfile = rfile_method()
+ if rfile != node:
+ seen_nodes[rfile] = 1
+
+ for node in nodes:
+ walker = SCons.Node.Walker(node,
+ kids_func=get_unseen_children,
+ eval_func=add_to_seen_nodes)
+ n = walker.next()
+ while n:
+ n = walker.next()
+
+ for node in seen_nodes.keys():
+ # Call node.clear() to clear most of the state
+ node.clear()
+ # node.clear() doesn't reset node.state, so call
+ # node.set_state() to reset it manually
+ node.set_state(SCons.Node.no_state)
+ node.implicit = None
+
+ SCons.SConsign.Reset()
+ SCons.Script.Main.progress_display("scons: done clearing node information.")
+
+ def do_clean(self, argv):
+ """\
+ clean [TARGETS] Clean (remove) the specified TARGETS
+ and their dependencies. 'c' is a synonym.
+ """
+ return self.do_build(['build', '--clean'] + argv[1:])
+
+ def do_EOF(self, argv):
+ print
+ self.do_exit(argv)
+
+ def _do_one_help(self, arg):
+ try:
+ # If help_<arg>() exists, then call it.
+ func = getattr(self, 'help_' + arg)
+ except AttributeError:
+ try:
+ func = getattr(self, 'do_' + arg)
+ except AttributeError:
+ doc = None
+ else:
+ doc = self._doc_to_help(func)
+ if doc:
+ sys.stdout.write(doc + '\n')
+ sys.stdout.flush()
+ else:
+ doc = self._strip_initial_spaces(func())
+ if doc:
+ sys.stdout.write(doc + '\n')
+ sys.stdout.flush()
+
+ def _doc_to_help(self, obj):
+ doc = obj.__doc__
+ if doc is None:
+ return ''
+ return self._strip_initial_spaces(doc)
+
+ def _strip_initial_spaces(self, s):
+ #lines = s.split('\n')
+ lines = string.split(s, '\n')
+ spaces = re.match(' *', lines[0]).group(0)
+ #def strip_spaces(l):
+ # if l.startswith(spaces):
+ # l = l[len(spaces):]
+ # return l
+ #return '\n'.join([ strip_spaces(l) for l in lines ])
+ def strip_spaces(l, spaces=spaces):
+ if l[:len(spaces)] == spaces:
+ l = l[len(spaces):]
+ return l
+ lines = map(strip_spaces, lines)
+ return string.join(lines, '\n')
+
+ def do_exit(self, argv):
+ """\
+ exit Exit SCons interactive mode.
+ """
+ sys.exit(0)
+
+ def do_help(self, argv):
+ """\
+ help [COMMAND] Prints help for the specified COMMAND. 'h'
+ and '?' are synonyms.
+ """
+ if argv[1:]:
+ for arg in argv[1:]:
+ if self._do_one_help(arg):
+ break
+ else:
+ # If bare 'help' is called, print this class's doc
+ # string (if it has one).
+ doc = self._doc_to_help(self.__class__)
+ if doc:
+ sys.stdout.write(doc + '\n')
+ sys.stdout.flush()
+
+ def do_shell(self, argv):
+ """\
+ shell [COMMANDLINE] Execute COMMANDLINE in a subshell. 'sh' and
+ '!' are synonyms.
+ """
+ import subprocess
+ argv = argv[1:]
+ if not argv:
+ argv = os.environ[self.shell_variable]
+ try:
+ p = subprocess.Popen(argv)
+ except EnvironmentError, e:
+ sys.stderr.write('scons: %s: %s\n' % (argv[0], e.strerror))
+ else:
+ p.wait()
+
+ def do_version(self, argv):
+ """\
+ version Prints SCons version information.
+ """
+ sys.stdout.write(self.parser.version + '\n')
+
+def interact(fs, parser, options, targets, target_top):
+ c = SConsInteractiveCmd(prompt = 'scons>>> ',
+ fs = fs,
+ parser = parser,
+ options = options,
+ targets = targets,
+ target_top = target_top)
+ c.cmdloop()
diff --git a/src/engine/SCons/Script/Main.py b/src/engine/SCons/Script/Main.py
index 97e0b19..bcbd0a1 100644
--- a/src/engine/SCons/Script/Main.py
+++ b/src/engine/SCons/Script/Main.py
@@ -68,6 +68,18 @@ import SCons.Taskmaster
import SCons.Util
import SCons.Warnings
+import SCons.Script.Interactive
+
+def fetch_win32_parallel_msg():
+ # A subsidiary function that exists solely to isolate this import
+ # so we don't have to pull it in on all platforms, and so that an
+ # in-line "import" statement in the _main() function below doesn't
+ # cause warnings about local names shadowing use of the 'SCons'
+ # global in nested scopes and UnboundLocalErrors and the like in some
+ # versions (2.1) of Python.
+ import SCons.Platform.win32
+ return SCons.Platform.win32.parallel_msg
+
#
class SConsPrintHelpException(Exception):
@@ -730,7 +742,6 @@ def version_string(label, module):
module.__buildsys__)
def _main(parser):
- import SCons
global exit_status
options = parser.values
@@ -750,7 +761,8 @@ def _main(parser):
SCons.Warnings.NoMetaclassSupportWarning,
SCons.Warnings.NoObjectCountWarning,
SCons.Warnings.NoParallelSupportWarning,
- SCons.Warnings.MisleadingKeywordsWarning, ]
+ SCons.Warnings.MisleadingKeywordsWarning,
+ SCons.Warnings.StackSizeWarning, ]
for warning in default_warnings:
SCons.Warnings.enableWarningClass(warning)
SCons.Warnings._warningOut = _scons_internal_warning
@@ -835,10 +847,10 @@ def _main(parser):
SCons.Node.implicit_cache = options.implicit_cache
SCons.Node.implicit_deps_changed = options.implicit_deps_changed
SCons.Node.implicit_deps_unchanged = options.implicit_deps_unchanged
+
if options.no_exec:
SCons.SConf.dryrun = 1
SCons.Action.execute_actions = None
- CleanTask.execute = CleanTask.show
if options.question:
SCons.SConf.dryrun = 1
if options.clean:
@@ -850,19 +862,6 @@ def _main(parser):
if options.no_progress or options.silent:
progress_display.set_mode(0)
- if options.silent:
- display.set_mode(0)
- if options.silent:
- SCons.Action.print_actions = None
-
- if options.cache_disable:
- SCons.CacheDir.CacheDir = SCons.Util.Null()
- if options.cache_debug:
- SCons.CacheDir.cache_debug = options.cache_debug
- if options.cache_force:
- SCons.CacheDir.cache_force = True
- if options.cache_show:
- SCons.CacheDir.cache_show = True
if options.site_dir:
_load_site_scons_dir(d, options.site_dir)
@@ -887,7 +886,18 @@ def _main(parser):
SCons.Script._Add_Targets(targets + parser.rargs)
SCons.Script._Add_Arguments(xmit_args)
- sys.stdout = SCons.Util.Unbuffered(sys.stdout)
+ # If stdout is not a tty, replace it with a wrapper object to call flush
+ # after every write.
+ #
+ # Tty devices automatically flush after every newline, so the replacement
+ # isn't necessary. Furthermore, if we replace sys.stdout, the readline
+ # module will no longer work. This affects the behavior during
+ # --interactive mode. --interactive should only be used when stdin and
+ # stdout refer to a tty.
+ if not sys.stdout.isatty():
+ sys.stdout = SCons.Util.Unbuffered(sys.stdout)
+ if not sys.stderr.isatty():
+ sys.stderr = SCons.Util.Unbuffered(sys.stderr)
memory_stats.append('before reading SConscript files:')
count_stats.append(('pre-', 'read'))
@@ -956,6 +966,47 @@ def _main(parser):
SCons.Node.implicit_cache = options.implicit_cache
SCons.Node.FS.set_duplicate(options.duplicate)
fs.set_max_drift(options.max_drift)
+ if not options.stack_size is None:
+ SCons.Job.stack_size = options.stack_size
+
+ platform = SCons.Platform.platform_module()
+
+ if options.interactive:
+ SCons.Script.Interactive.interact(fs, OptionsParser, options,
+ targets, target_top)
+
+ else:
+
+ # Build the targets
+ nodes = _build_targets(fs, options, targets, target_top)
+ if not nodes:
+ exit_status = 2
+
+def _build_targets(fs, options, targets, target_top):
+
+ progress_display.set_mode(not (options.no_progress or options.silent))
+ display.set_mode(not options.silent)
+ SCons.Action.print_actions = not options.silent
+ SCons.Action.execute_actions = not options.no_exec
+ SCons.SConf.dryrun = options.no_exec
+
+ if options.diskcheck:
+ SCons.Node.FS.set_diskcheck(options.diskcheck)
+
+ _set_debug_values(options)
+ SCons.Node.implicit_cache = options.implicit_cache
+ SCons.Node.implicit_deps_changed = options.implicit_deps_changed
+ SCons.Node.implicit_deps_unchanged = options.implicit_deps_unchanged
+
+ SCons.CacheDir.cache_enabled = not options.cache_disable
+ SCons.CacheDir.cache_debug = options.cache_debug
+ SCons.CacheDir.cache_force = options.cache_force
+ SCons.CacheDir.cache_show = options.cache_show
+
+ if options.no_exec:
+ CleanTask.execute = CleanTask.show
+ else:
+ CleanTask.execute = CleanTask.remove
lookup_top = None
if targets or SCons.Script.BUILD_TARGETS != SCons.Script._build_plus_default:
@@ -1003,7 +1054,7 @@ def _main(parser):
if not targets:
sys.stderr.write("scons: *** No targets specified and no Default() targets found. Stop.\n")
- sys.exit(2)
+ return None
def Entry(x, ltop=lookup_top, ttop=target_top, fs=fs):
if isinstance(x, SCons.Node.Node):
@@ -1046,7 +1097,7 @@ def _main(parser):
opening_message = "Cleaning targets ..."
closing_message = "done cleaning targets."
if options.keep_going:
- closing_message = "done cleaning targets (errors occurred during clean)."
+ failure_message = "done cleaning targets (errors occurred during clean)."
else:
failure_message = "cleaning terminated because of errors."
except AttributeError:
@@ -1091,8 +1142,7 @@ def _main(parser):
msg = "parallel builds are unsupported by this version of Python;\n" + \
"\tignoring -j or num_jobs option.\n"
elif sys.platform == 'win32':
- import SCons.Platform.win32
- msg = SCons.Platform.win32.parallel_msg
+ msg = fetch_win32_parallel_msg()
if msg:
SCons.Warnings.warn(SCons.Warnings.NoParallelSupportWarning, msg)
@@ -1101,7 +1151,15 @@ def _main(parser):
try:
progress_display("scons: " + opening_message)
- jobs.run()
+ try:
+ jobs.run()
+ except KeyboardInterrupt:
+ # If we are in interactive mode, a KeyboardInterrupt
+ # interrupts only this current run. Return 'nodes' normally
+ # so that the outer loop can clean up the nodes and continue.
+ if options.interactive:
+ print "Build interrupted."
+ # Continue and return normally
finally:
jobs.cleanup()
if exit_status:
@@ -1114,6 +1172,8 @@ def _main(parser):
memory_stats.append('after building targets:')
count_stats.append(('post-', 'build'))
+ return nodes
+
def _exec_main(parser, values):
sconsflags = os.environ.get('SCONSFLAGS', '')
all_args = string.split(sconsflags) + sys.argv[1:]
diff --git a/src/engine/SCons/Script/SConsOptions.py b/src/engine/SCons/Script/SConsOptions.py
index 46ece27..8f7116d 100644
--- a/src/engine/SCons/Script/SConsOptions.py
+++ b/src/engine/SCons/Script/SConsOptions.py
@@ -122,6 +122,7 @@ class SConsValues(optparse.Values):
'no_exec',
'num_jobs',
'random',
+ 'stack_size',
]
def set_option(self, name, value):
@@ -163,6 +164,11 @@ class SConsValues(optparse.Values):
# Set this right away so it can affect the rest of the
# file/Node lookups while processing the SConscript files.
SCons.Node.FS.set_diskcheck(value)
+ elif name == 'stack_size':
+ try:
+ value = int(value)
+ except ValueError:
+ raise SCons.Errors.UserError, "An integer is required: %s"%repr(value)
self.__SConscript_settings__[name] = value
@@ -466,6 +472,7 @@ def Parser(version):
usage="usage: scons [OPTION] [TARGET] ...",)
op.preserve_unknown_options = True
+ op.version = version
# Add the options to the parser we just created.
#
@@ -667,6 +674,11 @@ def Parser(version):
action="callback", callback=opt_implicit_deps,
help="Ignore changes in implicit dependencies.")
+ op.add_option('--interact', '--interactive',
+ dest='interactive', default=False,
+ action="store_true",
+ help="Run in interactive mode.")
+
op.add_option('-j', '--jobs',
nargs=1, type="int",
dest="num_jobs", default=1,
@@ -730,6 +742,13 @@ def Parser(version):
help="Use DIR instead of the usual site_scons dir.",
metavar="DIR")
+ op.add_option('--stack-size',
+ nargs=1, type="int",
+ dest='stack_size',
+ action="store",
+ help="Set the stack size of the threads used to run jobs to N kilobytes.",
+ metavar="N")
+
op.add_option('--taskmastertrace',
nargs=1,
dest="taskmastertrace_file", default=None,
@@ -777,8 +796,8 @@ def Parser(version):
help="Search up directory tree for SConstruct, "
"build Default() targets from local SConscript.")
- def opt_version(option, opt, value, parser, version=version):
- sys.stdout.write(version + '\n')
+ def opt_version(option, opt, value, parser):
+ sys.stdout.write(parser.version + '\n')
sys.exit(0)
op.add_option("-v", "--version",
action="callback", callback=opt_version,
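Because 'stack_size' is added to the settable options above, the job-thread stack size can also be set from a build script rather than only via the new --stack-size flag; this is a hypothetical SConstruct fragment:

    # Use 128 KB stacks for the threads that run parallel jobs.
    SetOption('stack_size', 128)
    env = Environment()
    env.Program('hello', ['hello.c'])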
diff --git a/src/engine/SCons/Subst.py b/src/engine/SCons/Subst.py
index 989f1dd..7a565ba 100644
--- a/src/engine/SCons/Subst.py
+++ b/src/engine/SCons/Subst.py
@@ -39,11 +39,11 @@ import UserString
import SCons.Errors
-from SCons.Util import is_String, is_List, is_Tuple
+from SCons.Util import is_String, is_Sequence
# Indexed by the SUBST_* constants below.
-_strconv = [SCons.Util.to_String,
- SCons.Util.to_String,
+_strconv = [SCons.Util.to_String_for_subst,
+ SCons.Util.to_String_for_subst,
SCons.Util.to_String_for_signature]
@@ -188,7 +188,7 @@ class NLWrapper:
list = self.list
if list is None:
list = []
- elif not is_List(list) and not is_Tuple(list):
+ elif not is_Sequence(list):
list = [list]
# The map(self.func) call is what actually turns
# a list into appropriate proxies.
@@ -203,10 +203,10 @@ class Targets_or_Sources(UserList.UserList):
wrapping a NLWrapper. This class handles the different methods used
to access the list, calling the NLWrapper to create proxies on demand.
- Note that we subclass UserList.UserList purely so that the is_List()
- function will identify an object of this class as a list during
- variable expansion. We're not really using any UserList.UserList
- methods in practice.
+ Note that we subclass UserList.UserList purely so that the
+ is_Sequence() function will identify an object of this class as
+ a list during variable expansion. We're not really using any
+ UserList.UserList methods in practice.
"""
def __init__(self, nl):
self.nl = nl
@@ -312,6 +312,25 @@ _remove = re.compile(r'\$\([^\$]*(\$[^\)][^\$]*)*\$\)')
# Indexed by the SUBST_* constants above.
_regex_remove = [ _rm, None, _remove ]
+def _rm_list(list):
+ #return [ l for l in list if not l in ('$(', '$)') ]
+ return filter(lambda l: not l in ('$(', '$)'), list)
+
+def _remove_list(list):
+ result = []
+ do_append = result.append
+ for l in list:
+ if l == '$(':
+ do_append = lambda x: None
+ elif l == '$)':
+ do_append = result.append
+ else:
+ do_append(l)
+ return result
+
+# Indexed by the SUBST_* constants above.
+_list_remove = [ _rm_list, None, _remove_list ]
+
# Regular expressions for splitting strings and handling substitutions,
# for use by the scons_subst() and scons_subst_list() functions:
#
@@ -342,7 +361,8 @@ _separate_args = re.compile(r'(%s|\s+|[^\s\$]+|\$)' % _dollar_exps_str)
_space_sep = re.compile(r'[\t ]+(?![^{]*})')
def scons_subst(strSubst, env, mode=SUBST_RAW, target=None, source=None, gvars={}, lvars={}, conv=None):
- """Expand a string containing construction variable substitutions.
+ """Expand a string or list containing construction variable
+ substitutions.
This is the work-horse function for substitutions in file names
and the like. The companion scons_subst_list() function (below)
@@ -427,11 +447,10 @@ def scons_subst(strSubst, env, mode=SUBST_RAW, target=None, source=None, gvars={
var = string.split(key, '.')[0]
lv[var] = ''
return self.substitute(s, lv)
- elif is_List(s) or is_Tuple(s):
+ elif is_Sequence(s):
def func(l, conv=self.conv, substitute=self.substitute, lvars=lvars):
return conv(substitute(l, lvars))
- r = map(func, s)
- return string.join(r)
+ return map(func, s)
elif callable(s):
try:
s = s(target=self.target,
@@ -458,6 +477,7 @@ def scons_subst(strSubst, env, mode=SUBST_RAW, target=None, source=None, gvars={
separate tokens.
"""
if is_String(args) and not isinstance(args, CmdStringHolder):
+ args = str(args) # In case it's a UserString.
try:
def sub_match(match, conv=self.conv, expand=self.expand, lvars=lvars):
return conv(expand(match.group(1), lvars))
@@ -472,11 +492,10 @@ def scons_subst(strSubst, env, mode=SUBST_RAW, target=None, source=None, gvars={
result = []
for a in args:
result.append(self.conv(self.expand(a, lvars)))
- try:
- result = string.join(result, '')
- except TypeError:
- if len(result) == 1:
- result = result[0]
+ if len(result) == 1:
+ result = result[0]
+ else:
+ result = string.join(map(str, result), '')
return result
else:
return self.expand(args, lvars)
@@ -524,6 +543,10 @@ def scons_subst(strSubst, env, mode=SUBST_RAW, target=None, source=None, gvars={
# Compress strings of white space characters into
# a single space.
result = string.strip(_space_sep.sub(' ', result))
+ elif is_Sequence(result):
+ remove = _list_remove[mode]
+ if remove:
+ result = remove(result)
return result
@@ -634,7 +657,7 @@ def scons_subst_list(strSubst, env, mode=SUBST_RAW, target=None, source=None, gv
lv[var] = ''
self.substitute(s, lv, 0)
self.this_word()
- elif is_List(s) or is_Tuple(s):
+ elif is_Sequence(s):
for a in s:
self.substitute(a, lvars, 1)
self.next_word()
@@ -666,6 +689,7 @@ def scons_subst_list(strSubst, env, mode=SUBST_RAW, target=None, source=None, gv
"""
if is_String(args) and not isinstance(args, CmdStringHolder):
+ args = str(args) # In case it's a UserString.
args = _separate_args.findall(args)
for a in args:
if a[0] in ' \t\n\r\f\v':
@@ -827,18 +851,18 @@ def scons_subst_once(strSubst, env, key):
a = match.group(1)
if a in matchlist:
a = val
- if is_List(a) or is_Tuple(a):
+ if is_Sequence(a):
return string.join(map(str, a))
else:
return str(a)
- if is_List(strSubst) or is_Tuple(strSubst):
+ if is_Sequence(strSubst):
result = []
for arg in strSubst:
if is_String(arg):
if arg in matchlist:
arg = val
- if is_List(arg) or is_Tuple(arg):
+ if is_Sequence(arg):
result.extend(arg)
else:
result.append(arg)
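The behaviour of the new list-based '$(' / '$)' helpers can be seen with a small standalone example that mirrors _rm_list() and _remove_list() above (illustrative only):

    def rm_markers(words):
        # Like _rm_list(): drop the markers themselves, keep everything else.
        return [w for w in words if w not in ('$(', '$)')]

    def remove_bracketed(words):
        # Like _remove_list(): drop the markers and everything between them.
        result, keeping = [], True
        for w in words:
            if w == '$(':
                keeping = False
            elif w == '$)':
                keeping = True
            elif keeping:
                result.append(w)
        return result

    words = ['|', '$(', 'a', '|', 'b', '$)', '|', 'c', '1']
    print(rm_markers(words))        # ['|', 'a', '|', 'b', '|', 'c', '1']
    print(remove_bracketed(words))  # ['|', '|', 'c', '1']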
diff --git a/src/engine/SCons/SubstTests.py b/src/engine/SCons/SubstTests.py
index b6e5b71..c064164 100644
--- a/src/engine/SCons/SubstTests.py
+++ b/src/engine/SCons/SubstTests.py
@@ -190,6 +190,7 @@ class SubstTestCase(unittest.TestCase):
'T' : ('x', 'y'),
'CS' : cs,
'CL' : cl,
+ 'US' : UserString.UserString('us'),
# Test function calls within ${}.
'FUNCCALL' : '${FUNC1("$AAA $FUNC2 $BBB")}',
@@ -317,6 +318,12 @@ class SubstTestCase(unittest.TestCase):
'$CS', 'cs',
'$CL', 'cl',
+ # Various uses of UserString.
+ UserString.UserString('x'), 'x',
+ UserString.UserString('$X'), 'x',
+ UserString.UserString('$US'), 'us',
+ '$US', 'us',
+
# Test function calls within ${}.
'$FUNCCALL', 'a xc b',
@@ -404,9 +411,9 @@ class SubstTestCase(unittest.TestCase):
"This is test",
["|", "$(", "$AAA", "|", "$BBB", "$)", "|", "$CCC", 1],
- "| $( a | b $) | c 1",
- "| a | b | c 1",
- "| | c 1",
+ ["|", "$(", "a", "|", "b", "$)", "|", "c", "1"],
+ ["|", "a", "|", "b", "|", "c", "1"],
+ ["|", "|", "c", "1"],
]
gvars = env.Dictionary()
@@ -570,7 +577,7 @@ class SubstTestCase(unittest.TestCase):
cmd = SCons.Util.CLVar("test $FOO $BAR $CALL test")
newcmd = scons_subst(cmd, env, gvars=env.Dictionary())
- assert newcmd == 'test foo bar call test', newcmd
+ assert newcmd == ['test', 'foo', 'bar', 'call', 'test'], newcmd
cmd_list = scons_subst_list(cmd, env, gvars=env.Dictionary())
assert len(cmd_list) == 1, cmd_list
@@ -653,6 +660,7 @@ class SubstTestCase(unittest.TestCase):
'L' : ['x', 'y'],
'CS' : cs,
'CL' : cl,
+ 'US' : UserString.UserString('us'),
# Test function calls within ${}.
'FUNCCALL' : '${FUNC1("$AAA $FUNC2 $BBB")}',
@@ -786,6 +794,16 @@ class SubstTestCase(unittest.TestCase):
'$CL', [['cl']],
['$CL'], [['cl']],
+ # Various uses of UserString.
+ UserString.UserString('x'), [['x']],
+ [UserString.UserString('x')], [['x']],
+ UserString.UserString('$X'), [['x']],
+ [UserString.UserString('$X')], [['x']],
+ UserString.UserString('$US'), [['us']],
+ [UserString.UserString('$US')], [['us']],
+ '$US', [['us']],
+ ['$US'], [['us']],
+
# Test function calls within ${}.
'$FUNCCALL', [['a', 'xc', 'b']],
diff --git a/src/engine/SCons/Taskmaster.py b/src/engine/SCons/Taskmaster.py
index 3bb4225..9db8138 100644
--- a/src/engine/SCons/Taskmaster.py
+++ b/src/engine/SCons/Taskmaster.py
@@ -562,21 +562,6 @@ class Taskmaster:
childstate = map(lambda N: (N, N.get_state()), children)
- # Skip this node if any of its children have failed. This
- # catches the case where we're descending a top-level target
- # and one of our children failed while trying to be built
- # by a *previous* descent of an earlier top-level target.
- failed_children = filter(lambda I: I[1] == SCons.Node.failed,
- childstate)
- if failed_children:
- node.set_state(SCons.Node.failed)
- if S: S.child_failed = S.child_failed + 1
- if T:
- c = map(str, failed_children)
- c.sort()
- T.write(' children failed:\n %s\n' % c)
- continue
-
# Detect dependency cycles:
pending_nodes = filter(lambda I: I[1] == SCons.Node.pending, childstate)
if pending_nodes:
@@ -632,6 +617,35 @@ class Taskmaster:
T.write(' waiting on side effects:\n %s\n' % c)
continue
+ # Skip this node if any of its children have failed.
+ #
+ # This catches the case where we're descending a top-level
+ # target and one of our children failed while trying to be
+ # built by a *previous* descent of an earlier top-level
+ # target.
+ #
+ # It can also occur if a node is reused by multiple targets:
+ # the first descent reaches it through one target, and a later
+ # descent reaches it again through the other target.
+ #
+ # Note that we can only have failed_children if the
+ # --keep-going flag was used, because without it the build
+ # stops before descending into the other branch.
+ #
+ # Note that even if one of the children fails, we have still
+ # added the other children to the list of candidate nodes,
+ # so the build keeps going (--keep-going).
+ failed_children = filter(lambda I: I[1] == SCons.Node.failed,
+ childstate)
+ if failed_children:
+ node.set_state(SCons.Node.failed)
+ if S: S.child_failed = S.child_failed + 1
+ if T:
+ c = map(lambda I: str(I[0]), failed_children)
+ c.sort()
+ T.write(' children failed:\n %s\n' % c)
+ continue
+
# The default when we've gotten through all of the checks above:
# this node is ready to be built.
if S: S.build = S.build + 1
diff --git a/src/engine/SCons/Tool/applelink.py b/src/engine/SCons/Tool/applelink.py
index a65a4af..532301f 100644
--- a/src/engine/SCons/Tool/applelink.py
+++ b/src/engine/SCons/Tool/applelink.py
@@ -35,12 +35,14 @@ __revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
import SCons.Util
-import gnulink
+# Even though the Mac is based on the GNU toolchain, it doesn't understand
+# the -rpath option, so we use the "link" tool instead of "gnulink".
+import link
def generate(env):
"""Add Builders and construction variables for applelink to an
Environment."""
- gnulink.generate(env)
+ link.generate(env)
env['FRAMEWORKPATHPREFIX'] = '-F'
env['_FRAMEWORKPATH'] = '${_concat(FRAMEWORKPATHPREFIX, FRAMEWORKPATH, "", __env__)}'
diff --git a/src/engine/SCons/Tool/gfortran.py b/src/engine/SCons/Tool/gfortran.py
new file mode 100644
index 0000000..f3db693
--- /dev/null
+++ b/src/engine/SCons/Tool/gfortran.py
@@ -0,0 +1,62 @@
+"""SCons.Tool.gfortran
+
+Tool-specific initialization for gfortran, the GNU Fortran 95/Fortran
+2003 compiler.
+
+There normally shouldn't be any need to import this module directly.
+It will usually be imported through the generic SCons.Tool.Tool()
+selection method.
+
+"""
+
+#
+# __COPYRIGHT__
+#
+# Permission is hereby granted, free of charge, to any person obtaining
+# a copy of this software and associated documentation files (the
+# "Software"), to deal in the Software without restriction, including
+# without limitation the rights to use, copy, modify, merge, publish,
+# distribute, sublicense, and/or sell copies of the Software, and to
+# permit persons to whom the Software is furnished to do so, subject to
+# the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY
+# KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE
+# WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+# OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+# WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+#
+
+__revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
+
+import SCons.Util
+
+import fortran
+
+def generate(env):
+ """Add Builders and construction variables for gfortran to an
+ Environment."""
+ fortran.generate(env)
+
+ # Which one is the right one? ifort uses _FORTRAND, ifl uses FORTRAN,
+ # aixf77 uses F77 ...
+ #env['_FORTRAND'] = 'gfortran'
+ env['FORTRAN'] = 'gfortran'
+
+ # XXX does this need to be set too ?
+ #env['SHFORTRAN'] = 'gfortran'
+
+ if env['PLATFORM'] in ['cygwin', 'win32']:
+ env['SHFORTRANFLAGS'] = SCons.Util.CLVar('$FORTRANFLAGS')
+ else:
+ env['SHFORTRANFLAGS'] = SCons.Util.CLVar('$FORTRANFLAGS -fPIC')
+
+ # XXX; Link problems: we need to add -lgfortran somewhere...
+
+def exists(env):
+ return env.Detect('gfortran')
diff --git a/src/engine/SCons/Tool/gfortran.xml b/src/engine/SCons/Tool/gfortran.xml
new file mode 100644
index 0000000..ba0fe76
--- /dev/null
+++ b/src/engine/SCons/Tool/gfortran.xml
@@ -0,0 +1,15 @@
+<!--
+__COPYRIGHT__
+
+This file is processed by the bin/SConsDoc.py module.
+See its __doc__ string for a discussion of the format.
+-->
+<tool name="gfortran">
+<summary>
+Sets construction variables for the GNU F95/F2003 GNU compiler.
+</summary>
+<sets>
+FORTRAN
+SHFORTRANFLAGS
+</sets>
+</tool>
diff --git a/src/engine/SCons/Tool/intelc.py b/src/engine/SCons/Tool/intelc.py
index 673c848..02cc52a 100644
--- a/src/engine/SCons/Tool/intelc.py
+++ b/src/engine/SCons/Tool/intelc.py
@@ -41,11 +41,14 @@ is_win64 = is_windows and (os.environ['PROCESSOR_ARCHITECTURE'] == 'AMD64' or
(os.environ.has_key('PROCESSOR_ARCHITEW6432') and
os.environ['PROCESSOR_ARCHITEW6432'] == 'AMD64'))
is_linux = sys.platform == 'linux2'
+is_mac = sys.platform == 'darwin'
if is_windows:
import SCons.Tool.msvc
elif is_linux:
import SCons.Tool.gcc
+elif is_mac:
+ import SCons.Tool.gcc
import SCons.Util
import SCons.Warnings
@@ -106,6 +109,11 @@ def check_abi(abi):
'x86_64' : 'x86_64',
'em64t' : 'x86_64',
'amd64' : 'x86_64'}
+ if is_mac:
+ valid_abis = {'ia32' : 'ia32',
+ 'x86' : 'ia32',
+ 'x86_64' : 'x86_64',
+ 'em64t' : 'x86_64'}
try:
abi = valid_abis[abi]
except KeyError:
@@ -196,8 +204,22 @@ def get_all_compiler_versions():
if ok:
versions.append(subkey)
else:
- # Registry points to nonexistent dir. Ignore this version.
- print "Ignoring "+str(get_intel_registry_value('ProductDir', subkey, 'IA32'))
+ try:
+ # Registry points to nonexistent dir. Ignore this
+ # version.
+ value = get_intel_registry_value('ProductDir', subkey, 'IA32')
+ except MissingRegistryError, e:
+
+ # Registry key is left dangling (potentially
+ # after uninstalling).
+
+ print \
+ "scons: *** Ignoring the registry key for the Intel compiler version %s.\n" \
+ "scons: *** It seems that the compiler was uninstalled and that the registry\n" \
+ "scons: *** was not cleaned up properly.\n" % subkey
+ else:
+ print "scons: *** Ignoring "+str(value)
+
i = i + 1
except EnvironmentError:
# no more subkeys
@@ -205,11 +227,22 @@ def get_all_compiler_versions():
elif is_linux:
for d in glob.glob('/opt/intel_cc_*'):
# Typical dir here is /opt/intel_cc_80.
- versions.append(re.search(r'cc_(.*)$', d).group(1))
+ m = re.search(r'cc_(.*)$', d)
+ if m:
+ versions.append(m.group(1))
+ for d in glob.glob('/opt/intel/cc*/*'):
+ # Typical dir here is /opt/intel/cc/9.0 for IA32,
+ # /opt/intel/cce/9.0 for EMT64 (AMD64)
+ m = re.search(r'([0-9.]+)$', d)
+ if m:
+ versions.append(m.group(1))
+ elif is_mac:
for d in glob.glob('/opt/intel/cc*/*'):
# Typical dir here is /opt/intel/cc/9.0 for IA32,
# /opt/intel/cce/9.0 for EMT64 (AMD64)
- versions.append(re.search(r'([0-9.]+)$', d).group(1))
+ m = re.search(r'([0-9.]+)$', d)
+ if m:
+ versions.append(m.group(1))
versions = uniquify(versions) # remove dups
versions.sort(vercmp)
return versions
@@ -229,7 +262,7 @@ def get_intel_compiler_top(version, abi):
if not os.path.exists(os.path.join(top, "Bin", "icl.exe")):
raise MissingDirError, \
"Can't find Intel compiler in %s"%(top)
- elif is_linux:
+ elif is_mac or is_linux:
# first dir is new (>=9.0) style, second is old (8.0) style.
dirs=('/opt/intel/cc/%s', '/opt/intel_cc_%s')
if abi == 'x86_64':
@@ -256,7 +289,7 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
If topdir is used, version and abi are ignored.
verbose: (int) if >0, prints compiler version used.
"""
- if not (is_linux or is_windows):
+ if not (is_mac or is_linux or is_windows):
# can't handle this platform
return
@@ -264,6 +297,8 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
SCons.Tool.msvc.generate(env)
elif is_linux:
SCons.Tool.gcc.generate(env)
+ elif is_mac:
+ SCons.Tool.gcc.generate(env)
# if version is unspecified, use latest
vlist = get_all_compiler_versions()
@@ -284,7 +319,7 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
# alternatives are ia64 for Itanium, or amd64 or em64t or x86_64 (all synonyms here)
abi = check_abi(abi)
if abi is None:
- if is_linux:
+ if is_mac or is_linux:
# Check if we are on 64-bit linux, default to 64 then.
uname_m = os.uname()[4]
if uname_m == 'x86_64':
@@ -308,7 +343,7 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
# on $PATH and the user is importing their env.
class ICLTopDirWarning(SCons.Warnings.Warning):
pass
- if is_linux and not env.Detect('icc') or \
+ if (is_mac or is_linux) and not env.Detect('icc') or \
is_windows and not env.Detect('icl'):
SCons.Warnings.enableWarningClass(ICLTopDirWarning)
@@ -325,11 +360,14 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
if topdir:
if verbose:
- print "Intel C compiler: using version '%s' (%g), abi %s, in '%s'"%\
- (version, linux_ver_normalize(version),abi,topdir)
+ print "Intel C compiler: using version %s (%g), abi %s, in '%s'"%\
+ (repr(version), linux_ver_normalize(version),abi,topdir)
if is_linux:
# Show the actual compiler version by running the compiler.
os.system('%s/bin/icc --version'%topdir)
+ if is_mac:
+ # Show the actual compiler version by running the compiler.
+ os.system('%s/bin/icc --version'%topdir)
env['INTEL_C_COMPILER_TOP'] = topdir
if is_linux:
@@ -339,11 +377,22 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
'LD_LIBRARY_PATH' : 'lib'}
for p in paths:
env.PrependENVPath(p, os.path.join(topdir, paths[p]))
+ if is_mac:
+ paths={'INCLUDE' : 'include',
+ 'LIB' : 'lib',
+ 'PATH' : 'bin',
+ 'LD_LIBRARY_PATH' : 'lib'}
+ for p in paths:
+ env.PrependENVPath(p, os.path.join(topdir, paths[p]))
if is_windows:
# env key reg valname default subdir of top
paths=(('INCLUDE', 'IncludeDir', 'Include'),
('LIB' , 'LibDir', 'Lib'),
('PATH' , 'BinDir', 'Bin'))
+ # We are supposed to ignore version if topdir is set, so set
+ # it to the empty string if it's not already set.
+ if version is None:
+ version = ''
# Each path has a registry entry, use that or default to subdir
for p in paths:
try:
@@ -392,7 +441,9 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
licdir = None
for ld in [envlicdir, reglicdir]:
- if ld and os.path.exists(ld):
+ # If the string contains an '@', then assume it's a network
+ # license (port@system) and good by definition.
+ if ld and (string.find(ld, '@') != -1 or os.path.exists(ld)):
licdir = ld
break
if not licdir:
@@ -409,7 +460,7 @@ def generate(env, version=None, abi=None, topdir=None, verbose=0):
env['ENV']['INTEL_LICENSE_FILE'] = licdir
def exists(env):
- if not (is_linux or is_windows):
+ if not (is_mac or is_linux or is_windows):
# can't handle this platform
return 0
@@ -424,6 +475,8 @@ def exists(env):
return env.Detect('icl')
elif is_linux:
return env.Detect('icc')
+ elif is_mac:
+ return env.Detect('icc')
return detected
# end of file
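
The Linux and Mac version-detection changes above share one idea: guard every re.search() against directory names that do not end in a version number, instead of calling .group() unconditionally. A standalone sketch of that pattern; the /opt/intel paths come from the code above, everything else is illustrative:

import glob
import re

versions = []
for d in glob.glob('/opt/intel/cc*/*'):
    # Typical entries: /opt/intel/cc/9.0 (IA32), /opt/intel/cce/9.0 (EM64T).
    m = re.search(r'([0-9.]+)$', d)
    if m:                       # skip anything that doesn't end in a version
        versions.append(m.group(1))
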
diff --git a/src/engine/SCons/Tool/jar.py b/src/engine/SCons/Tool/jar.py
index 4f221c0..6594ecc 100644
--- a/src/engine/SCons/Tool/jar.py
+++ b/src/engine/SCons/Tool/jar.py
@@ -38,19 +38,32 @@ import SCons.Util
def jarSources(target, source, env, for_signature):
"""Only include sources that are not a manifest file."""
- jarchdir = env.subst('$JARCHDIR', target=target, source=source)
- if jarchdir:
- jarchdir = env.fs.Dir(jarchdir)
+ try:
+ env['JARCHDIR']
+ except KeyError:
+ jarchdir_set = False
+ else:
+ jarchdir_set = True
+ jarchdir = env.subst('$JARCHDIR', target=target, source=source)
+ if jarchdir:
+ jarchdir = env.fs.Dir(jarchdir)
result = []
for src in source:
contents = src.get_contents()
if contents[:16] != "Manifest-Version":
- if jarchdir:
+ if jarchdir_set:
+ _chdir = jarchdir
+ else:
+ try:
+ _chdir = src.attributes.java_classdir
+ except AttributeError:
+ _chdir = None
+ if _chdir:
# If we are changing the dir with -C, then sources should
# be relative to that directory.
- src = SCons.Subst.Literal(src.get_path(jarchdir))
+ src = SCons.Subst.Literal(src.get_path(_chdir))
result.append('-C')
- result.append(jarchdir)
+ result.append(_chdir)
result.append(src)
return result
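
The jarSources() change boils down to a three-way choice of the -C directory: an explicit $JARCHDIR, the per-node java_classdir attribute recorded by the Java builder, or nothing. A reduced sketch of just that decision; the function name and default argument below are invented for illustration:

def pick_jar_chdir(src, explicit_dir=None):
    # Prefer the directory the user configured via $JARCHDIR ...
    if explicit_dir is not None:
        return explicit_dir
    # ... otherwise fall back to whatever the Java builder recorded
    # on the source node, if anything.
    try:
        return src.attributes.java_classdir
    except AttributeError:
        return None
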
diff --git a/src/engine/SCons/Tool/jar.xml b/src/engine/SCons/Tool/jar.xml
index a0d730e..9e8fefa 100644
--- a/src/engine/SCons/Tool/jar.xml
+++ b/src/engine/SCons/Tool/jar.xml
@@ -34,6 +34,11 @@ If the &cv-link-JARCHDIR; value is set, the
command will change to the specified directory using the
<option>-C</option>
option.
+If &cv-JARCHDIR; is not set explicitly,
+&SCons; will use the top of any subdirectory tree
+in which Java <filename>.class</filename> files
+were built by the &b-link-Java; Builder.
+
 If the contents of any of the source files begin with the string
<literal>Manifest-Version</literal>,
the file is assumed to be a manifest
diff --git a/src/engine/SCons/Tool/link.py b/src/engine/SCons/Tool/link.py
index be1a81a..b60aa87 100644
--- a/src/engine/SCons/Tool/link.py
+++ b/src/engine/SCons/Tool/link.py
@@ -44,6 +44,11 @@ def smart_link(source, target, env, for_signature):
return '$CXX'
return '$CC'
+def shlib_emitter(target, source, env):
+ for tgt in target:
+ tgt.attributes.shared = 1
+ return (target, source)
+
def generate(env):
"""Add Builders and construction variables for gnulink to an Environment."""
SCons.Tool.createSharedLibBuilder(env)
@@ -54,14 +59,14 @@ def generate(env):
env['SHLINKCOM'] = '$SHLINK -o $TARGET $SHLINKFLAGS $SOURCES $_LIBDIRFLAGS $_LIBFLAGS'
# don't set up the emitter, cause AppendUnique will generate a list
# starting with None :-(
- #env['SHLIBEMITTER']= None
+ env.Append(SHLIBEMITTER = [shlib_emitter])
env['SMARTLINK'] = smart_link
env['LINK'] = "$SMARTLINK"
env['LINKFLAGS'] = SCons.Util.CLVar('')
env['LINKCOM'] = '$LINK -o $TARGET $LINKFLAGS $SOURCES $_LIBDIRFLAGS $_LIBFLAGS'
env['LIBDIRPREFIX']='-L'
env['LIBDIRSUFFIX']=''
- env['_LIBFLAGS']='${_stripixes(LIBLINKPREFIX, LIBS, LIBLINKSUFFIX, LIBPREFIX, LIBSUFFIX, __env__)}'
+ env['_LIBFLAGS']='${_stripixes(LIBLINKPREFIX, LIBS, LIBLINKSUFFIX, LIBPREFIXES, LIBSUFFIXES, __env__)}'
env['LIBLINKPREFIX']='-l'
env['LIBLINKSUFFIX']=''
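
Because SHLIBEMITTER is now appended to as a list rather than replaced, additional emitters can be chained from a project without disturbing shlib_emitter() above. A hedged SConstruct-level sketch; the emitter name and attribute it sets are made up for illustration:

# Illustrative SConstruct sketch only.
def record_shared_targets(target, source, env):
    # An emitter receives (target, source, env) and returns the pair,
    # possibly extended; this one only annotates the target nodes.
    for t in target:
        t.attributes.noted_by_project = 1
    return (target, source)

env = Environment(tools=['default'])
env.Append(SHLIBEMITTER=[record_shared_targets])
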
diff --git a/src/engine/SCons/Tool/mslink.py b/src/engine/SCons/Tool/mslink.py
index 25f3564..42eabaf 100644
--- a/src/engine/SCons/Tool/mslink.py
+++ b/src/engine/SCons/Tool/mslink.py
@@ -76,6 +76,9 @@ def windowsShlinkSources(target, source, env, for_signature):
def windowsLibEmitter(target, source, env):
SCons.Tool.msvc.validate_vars(env)
+ extratargets = []
+ extrasources = []
+
dll = env.FindIxes(target, "SHLIBPREFIX", "SHLIBSUFFIX")
no_import_lib = env.get('no_import_lib', 0)
@@ -87,38 +90,44 @@ def windowsLibEmitter(target, source, env):
not env.FindIxes(source, "WINDOWSDEFPREFIX", "WINDOWSDEFSUFFIX"):
# append a def file to the list of sources
- source.append(env.ReplaceIxes(dll,
- "SHLIBPREFIX", "SHLIBSUFFIX",
- "WINDOWSDEFPREFIX", "WINDOWSDEFSUFFIX"))
+ extrasources.append(
+ env.ReplaceIxes(dll,
+ "SHLIBPREFIX", "SHLIBSUFFIX",
+ "WINDOWSDEFPREFIX", "WINDOWSDEFSUFFIX"))
version_num, suite = SCons.Tool.msvs.msvs_parse_version(env.get('MSVS_VERSION', '6.0'))
if version_num >= 8.0 and env.get('WINDOWS_INSERT_MANIFEST', 0):
# MSVC 8 automatically generates .manifest files that must be installed
- target.append(env.ReplaceIxes(dll,
- "SHLIBPREFIX", "SHLIBSUFFIX",
- "WINDOWSSHLIBMANIFESTPREFIX", "WINDOWSSHLIBMANIFESTSUFFIX"))
+ extratargets.append(
+ env.ReplaceIxes(dll,
+ "SHLIBPREFIX", "SHLIBSUFFIX",
+ "WINDOWSSHLIBMANIFESTPREFIX", "WINDOWSSHLIBMANIFESTSUFFIX"))
if env.has_key('PDB') and env['PDB']:
pdb = env.arg2nodes('$PDB', target=target, source=source)[0]
- target.append(pdb)
+ extratargets.append(pdb)
target[0].attributes.pdb = pdb
if not no_import_lib and \
not env.FindIxes(target, "LIBPREFIX", "LIBSUFFIX"):
# Append an import library to the list of targets.
- target.append(env.ReplaceIxes(dll,
- "SHLIBPREFIX", "SHLIBSUFFIX",
- "LIBPREFIX", "LIBSUFFIX"))
+ extratargets.append(
+ env.ReplaceIxes(dll,
+ "SHLIBPREFIX", "SHLIBSUFFIX",
+ "LIBPREFIX", "LIBSUFFIX"))
# and .exp file is created if there are exports from a DLL
- target.append(env.ReplaceIxes(dll,
- "SHLIBPREFIX", "SHLIBSUFFIX",
- "WINDOWSEXPPREFIX", "WINDOWSEXPSUFFIX"))
+ extratargets.append(
+ env.ReplaceIxes(dll,
+ "SHLIBPREFIX", "SHLIBSUFFIX",
+ "WINDOWSEXPPREFIX", "WINDOWSEXPSUFFIX"))
- return (target, source)
+ return (target+extratargets, source+extrasources)
def prog_emitter(target, source, env):
SCons.Tool.msvc.validate_vars(env)
+ extratargets = []
+
exe = env.FindIxes(target, "PROGPREFIX", "PROGSUFFIX")
if not exe:
raise SCons.Errors.UserError, "An executable should have exactly one target with the suffix: %s" % env.subst("$PROGSUFFIX")
@@ -126,16 +135,17 @@ def prog_emitter(target, source, env):
version_num, suite = SCons.Tool.msvs.msvs_parse_version(env.get('MSVS_VERSION', '6.0'))
if version_num >= 8.0 and env.get('WINDOWS_INSERT_MANIFEST', 0):
# MSVC 8 automatically generates .manifest files that have to be installed
- target.append(env.ReplaceIxes(exe,
- "PROGPREFIX", "PROGSUFFIX",
- "WINDOWSPROGMANIFESTPREFIX", "WINDOWSPROGMANIFESTSUFFIX"))
+ extratargets.append(
+ env.ReplaceIxes(exe,
+ "PROGPREFIX", "PROGSUFFIX",
+ "WINDOWSPROGMANIFESTPREFIX", "WINDOWSPROGMANIFESTSUFFIX"))
if env.has_key('PDB') and env['PDB']:
pdb = env.arg2nodes('$PDB', target=target, source=source)[0]
- target.append(pdb)
+ extratargets.append(pdb)
target[0].attributes.pdb = pdb
- return (target,source)
+ return (target+extratargets,source)
def RegServerFunc(target, source, env):
if env.has_key('register') and env['register']:
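
The refactoring in both emitters above replaces in-place target.append() calls with separate accumulator lists that are concatenated onto the originals only at the end, so the optional outputs compose cleanly (this is what makes $PDB and $WINDOWS_INSERT_MANIFEST coexist). The skeleton of that pattern, stripped of the MSVC specifics; the function name is illustrative:

def emitter_skeleton(target, source, env):
    extratargets = []
    extrasources = []
    # Each optional feature (import library, .def file, manifest, PDB)
    # appends to the accumulators instead of mutating target/source ...
    return (target + extratargets, source + extrasources)
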
diff --git a/src/engine/SCons/Tool/qt.py b/src/engine/SCons/Tool/qt.py
index 105f42e..d67cddb 100644
--- a/src/engine/SCons/Tool/qt.py
+++ b/src/engine/SCons/Tool/qt.py
@@ -66,7 +66,7 @@ def checkMocIncluded(target, source, env):
cpp = source[0]
# looks like cpp.includes is cleared before the build stage :-(
# not really sure about the path transformations (moc.cwd? cpp.cwd?) :-/
- path = SCons.Defaults.CScan.path_function(env, moc.cwd)
+ path = SCons.Defaults.CScan.path(env, moc.cwd)
includes = SCons.Defaults.CScan(cpp, env, path)
if not moc in includes:
SCons.Warnings.warn(
diff --git a/src/engine/SCons/Tool/rmic.py b/src/engine/SCons/Tool/rmic.py
index 4b48e0b..ed5c8ee 100644
--- a/src/engine/SCons/Tool/rmic.py
+++ b/src/engine/SCons/Tool/rmic.py
@@ -79,9 +79,13 @@ def emit_rmic_classes(target, source, env):
s.attributes.java_classname = classname
slist.append(s)
+ stub_suffixes = ['_Stub']
+ if env.get('JAVAVERSION') == '1.4':
+ stub_suffixes.append('_Skel')
+
tlist = []
for s in source:
- for suff in ['_Skel', '_Stub']:
+ for suff in stub_suffixes:
fname = string.replace(s.attributes.java_classname, '.', os.sep) + \
suff + class_suffix
t = target[0].File(fname)
diff --git a/src/engine/SCons/Tool/swig.py b/src/engine/SCons/Tool/swig.py
index 8ca1b89..eba49a7 100644
--- a/src/engine/SCons/Tool/swig.py
+++ b/src/engine/SCons/Tool/swig.py
@@ -50,7 +50,8 @@ def swigSuffixEmitter(env, source):
else:
return '$SWIGCFILESUFFIX'
-_reModule = re.compile(r'%module\s+(.+)')
+# Match '%module test', as well as '%module(directors="1") test'
+_reModule = re.compile(r'%module(?:\s*\(.*\))?\s+(.+)')
def _swigEmitter(target, source, env):
swigflags = env.subst("$SWIGFLAGS", target=target, source=source)
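
The widened _reModule pattern accepts an optional parenthesized option list between %module and the module name, which is what issue 1859 asked for. A quick standalone check of both forms (matching behaviour only, independent of the rest of the emitter):

import re

_reModule = re.compile(r'%module(?:\s*\(.*\))?\s+(.+)')

assert _reModule.search('%module test').group(1) == 'test'
assert _reModule.search('%module(directors="1") test').group(1) == 'test'
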
diff --git a/src/engine/SCons/Tool/tex.py b/src/engine/SCons/Tool/tex.py
index bbae25e..c3156a3 100644
--- a/src/engine/SCons/Tool/tex.py
+++ b/src/engine/SCons/Tool/tex.py
@@ -42,7 +42,7 @@ import SCons.Node
import SCons.Node.FS
import SCons.Util
-warning_rerun_re = re.compile("^LaTeX Warning:.*Rerun", re.MULTILINE)
+warning_rerun_re = re.compile('(^LaTeX Warning:.*Rerun)|(^Package \w+ Warning:.*Rerun)', re.MULTILINE)
rerun_citations_str = "^LaTeX Warning:.*\n.*Rerun to get citations correct"
rerun_citations_re = re.compile(rerun_citations_str, re.MULTILINE)
@@ -76,26 +76,52 @@ def InternalLaTeXAuxAction(XXXLaTeXAction, target = None, source= None, env=None
basename = SCons.Util.splitext(str(source[0]))[0]
basedir = os.path.split(str(source[0]))[0]
-
- # Notice that all the filenames are not prefixed with the basedir.
- # That's because the *COM variables have the cd command in the prolog.
-
- bblfilename = basename + '.bbl'
+ basefile = os.path.split(str(basename))[1]
+ abspath = os.path.abspath(basedir)
+ targetbase = SCons.Util.splitext(str(target[0]))[0]
+ targetdir = os.path.split(str(target[0]))[0]
+
+ # Not sure if these environment changes should go here or whether we
+ # should make the user do them. We undo all but TEXPICTS, but there is
+ # still the side effect of creating empty (':') entries in the environment.
+
+ def modify_env_var(env, var, abspath):
+ try:
+ save = env['ENV'][var]
+ except KeyError:
+ save = ':'
+ env['ENV'][var] = ''
+ if SCons.Util.is_List(env['ENV'][var]):
+ env['ENV'][var] = [abspath] + env['ENV'][var]
+ else:
+ env['ENV'][var] = abspath + os.pathsep + env['ENV'][var]
+ return save
+
+ texinputs_save = modify_env_var(env, 'TEXINPUTS', abspath)
+ bibinputs_save = modify_env_var(env, 'BIBINPUTS', abspath)
+ bstinputs_save = modify_env_var(env, 'BSTINPUTS', abspath)
+ texpicts_save = modify_env_var(env, 'TEXPICTS', abspath)
+
+ # Create these file names with the target directory since they will
+ # be made there. That's because the *COM variables have the cd
+ # command in the prolog.
+
+ bblfilename = os.path.join(targetdir, basefile + '.bbl')
bblContents = ""
if os.path.exists(bblfilename):
bblContents = open(bblfilename, "rb").read()
- idxfilename = basename + '.idx'
+ idxfilename = os.path.join(targetdir, basefile + '.idx')
idxContents = ""
if os.path.exists(idxfilename):
idxContents = open(idxfilename, "rb").read()
- tocfilename = basename + '.toc'
+ tocfilename = os.path.join(targetdir, basefile + '.toc')
tocContents = ""
if os.path.exists(tocfilename):
tocContents = open(tocfilename, "rb").read()
- # Run LaTeX once to generate a new aux file.
+ # Run LaTeX once to generate a new aux file and log file.
XXXLaTeXAction(target, source, env)
# Decide if various things need to be run, or run again. We check
@@ -104,7 +130,7 @@ def InternalLaTeXAuxAction(XXXLaTeXAction, target = None, source= None, env=None
# with stubs that don't necessarily generate all of the same files.
# Read the log file to find all .aux files
- logfilename = basename + '.log'
+ logfilename = os.path.join(targetbase + '.log')
auxfiles = []
if os.path.exists(logfilename):
content = open(logfilename, "rb").read()
@@ -112,10 +138,11 @@ def InternalLaTeXAuxAction(XXXLaTeXAction, target = None, source= None, env=None
# Now decide if bibtex will need to be run.
for auxfilename in auxfiles:
- if os.path.exists(os.path.join(basedir, auxfilename)):
- content = open(os.path.join(basedir, auxfilename), "rb").read()
+ target_aux = os.path.join(targetdir, auxfilename)
+ if os.path.exists(target_aux):
+ content = open(target_aux, "rb").read()
if string.find(content, "bibdata") != -1:
- bibfile = env.fs.File(basename)
+ bibfile = env.fs.File(targetbase)
BibTeXAction(bibfile, bibfile, env)
break
@@ -131,7 +158,7 @@ def InternalLaTeXAuxAction(XXXLaTeXAction, target = None, source= None, env=None
# Now decide if latex will need to be run again due to index.
if os.path.exists(idxfilename) and idxContents != open(idxfilename, "rb").read():
# We must run makeindex
- idxfile = env.fs.File(basename)
+ idxfile = env.fs.File(targetbase)
MakeIndexAction(idxfile, idxfile, env)
must_rerun_latex = 1
@@ -139,7 +166,7 @@ def InternalLaTeXAuxAction(XXXLaTeXAction, target = None, source= None, env=None
XXXLaTeXAction(target, source, env)
# Now decide if latex needs to be run yet again to resolve warnings.
- logfilename = basename + '.log'
+ logfilename = targetbase + '.log'
for _ in range(int(env.subst('$LATEXRETRIES'))):
if not os.path.exists(logfilename):
break
@@ -149,6 +176,15 @@ def InternalLaTeXAuxAction(XXXLaTeXAction, target = None, source= None, env=None
not undefined_references_re.search(content):
break
XXXLaTeXAction(target, source, env)
+
+ env['ENV']['TEXINPUTS'] = texinputs_save
+ env['ENV']['BIBINPUTS'] = bibinputs_save
+ env['ENV']['BSTINPUTS'] = bstinputs_save
+
+ # The TEXPICTS environment variable is needed by a later dvi -> pdf
+ # step on Mac OS X, so leave it alone:
+ # env['ENV']['TEXPICTS'] = texpicts_save
+
return 0
def LaTeXAuxAction(target = None, source= None, env=None):
@@ -176,27 +212,29 @@ def TeXLaTeXFunction(target = None, source= None, env=None):
def tex_emitter(target, source, env):
base = SCons.Util.splitext(str(source[0]))[0]
- target.append(base + '.aux')
- env.Precious(base + '.aux')
- target.append(base + '.log')
+ targetbase = SCons.Util.splitext(str(target[0]))[0]
+
+ target.append(targetbase + '.aux')
+ env.Precious(targetbase + '.aux')
+ target.append(targetbase + '.log')
for f in source:
content = f.get_contents()
if tableofcontents_re.search(content):
- target.append(base + '.toc')
- env.Precious(base + '.toc')
+ target.append(targetbase + '.toc')
+ env.Precious(targetbase + '.toc')
if makeindex_re.search(content):
- target.append(base + '.ilg')
- target.append(base + '.ind')
- target.append(base + '.idx')
- env.Precious(base + '.idx')
+ target.append(targetbase + '.ilg')
+ target.append(targetbase + '.ind')
+ target.append(targetbase + '.idx')
+ env.Precious(targetbase + '.idx')
if bibliography_re.search(content):
- target.append(base + '.bbl')
- env.Precious(base + '.bbl')
- target.append(base + '.blg')
+ target.append(targetbase + '.bbl')
+ env.Precious(targetbase + '.bbl')
+ target.append(targetbase + '.blg')
- # read log file to get all output file (include .aux files)
- logfilename = base + '.log'
- dir, base_nodir = os.path.split(base)
+ # read log file to get all .aux files
+ logfilename = targetbase + '.log'
+ dir, base_nodir = os.path.split(targetbase)
if os.path.exists(logfilename):
content = open(logfilename, "rb").read()
out_files = openout_re.findall(content)
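
modify_env_var() above is a save-prepend-restore helper for the kpathsea search variables (TEXINPUTS and friends) so that BuildDir(duplicate=0) builds can find their sources. A simplified standalone sketch of the same idea, without the SCons environment plumbing; the dictionary contents and paths below are hypothetical:

import os

def prepend_path_var(environ, var, abspath):
    # Remember the old value (':' stands in for "previously unset"),
    # then put abspath at the front of the search path.
    save = environ.get(var, ':')
    environ[var] = abspath + os.pathsep + environ.get(var, '')
    return save

environ = {'TEXINPUTS': '/usr/share/texmf'}          # made-up starting value
old = prepend_path_var(environ, 'TEXINPUTS', '/home/user/doc')
# ... run latex/bibtex with this environment ...
environ['TEXINPUTS'] = old                           # restore, as the action does
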
diff --git a/src/engine/SCons/Tool/yacc.py b/src/engine/SCons/Tool/yacc.py
index 34f60cb..0b648e8 100644
--- a/src/engine/SCons/Tool/yacc.py
+++ b/src/engine/SCons/Tool/yacc.py
@@ -54,7 +54,7 @@ def _yaccEmitter(target, source, env, ysuf, hsuf):
# If -d is specified on the command line, yacc will emit a .h
# or .hpp file with the same name as the .c or .cpp output file.
if '-d' in flags:
- target.append(targetBase + env.subst(hsuf))
+ target.append(targetBase + env.subst(hsuf, target=target, source=source))
# If -g is specified on the command line, yacc will emit a .vcg
# file with the same base name as the .y, .yacc, .ym or .yy file.
@@ -108,7 +108,14 @@ def generate(env):
env['YACCFLAGS'] = SCons.Util.CLVar('')
env['YACCCOM'] = '$YACC $YACCFLAGS -o $TARGET $SOURCES'
env['YACCHFILESUFFIX'] = '.h'
- env['YACCHXXFILESUFFIX'] = '.hpp'
+
+ if env['PLATFORM'] == 'darwin':
+ # Bison on Mac OS X just appends ".h" to the generated target .cc
+ # or .cpp file name. Hooray for delayed expansion of variables.
+ env['YACCHXXFILESUFFIX'] = '${TARGET.suffix}.h'
+ else:
+ env['YACCHXXFILESUFFIX'] = '.hpp'
+
env['YACCVCGFILESUFFIX'] = '.vcg'
def exists(env):
diff --git a/src/engine/SCons/Tool/yacc.xml b/src/engine/SCons/Tool/yacc.xml
index 2db0603..aa648b1 100644
--- a/src/engine/SCons/Tool/yacc.xml
+++ b/src/engine/SCons/Tool/yacc.xml
@@ -87,7 +87,13 @@ file with the specified suffix,
it exists to allow you to specify
what suffix the parser generator will use of its own accord.
The default value is
-<filename>.hpp</filename>.
+<filename>.hpp</filename>,
+except on Mac OS X,
+where the default is
+<filename>${TARGET.suffix}.h</filename>,
+because the default &bison; parser generator just
+appends <filename>.h</filename>
+to the name of the generated C++ file.
</summary>
</cvar>
diff --git a/src/engine/SCons/Util.py b/src/engine/SCons/Util.py
index 258de0f..08ce1f2 100644
--- a/src/engine/SCons/Util.py
+++ b/src/engine/SCons/Util.py
@@ -133,10 +133,17 @@ def to_String_for_signature(obj):
try:
f = obj.for_signature
except AttributeError:
- return to_String(obj)
+ return to_String_for_subst(obj)
else:
return f()
+def to_String_for_subst(s):
+ if is_Sequence( s ):
+ return string.join( map(to_String_for_subst, s) )
+
+ return to_String( s )
+
+
class CallableComposite(UserList):
"""A simple composite callable class that, when called, will invoke all
of its contained callables with the same arguments."""
@@ -344,55 +351,80 @@ def print_tree(root, child_func, prune=0, showtags=0, margin=[0], visited={}):
# Yes, all of this manual testing breaks polymorphism, and the real
# Pythonic way to do all of this would be to just try it and handle the
# exception, but handling the exception when it's not the right type is
-# too slow.
-#
-# The actual implementations here have been selected after timings
-# coded up in in bench/is_types.py (from the SCons source tree, see the
-# scons-src distribution). Key results from those timings:
-#
-# -- Storing the type of the object in a variable (t = type(obj))
-# slows down the case where it's a native type and the first
-# comparison will match, but nicely speeds up the case where
-# it's a different native type. Since that's going to be common,
-# it's a good tradeoff.
-#
-# -- The data show that calling isinstance() on an object that's
-# a native type (dict, list or string) is expensive enough that
-# checking up front for whether the object is of type InstanceType
-# is a pretty big win, even though it does slow down the case
-# where it really *is* an object instance a little bit.
-
-def is_Dict(obj):
- t = type(obj)
- return t is DictType or \
- (t is InstanceType and isinstance(obj, UserDict))
-
-def is_List(obj):
- t = type(obj)
- return t is ListType \
- or (t is InstanceType and isinstance(obj, UserList))
-
-def is_Sequence(obj):
- t = type(obj)
- return t is ListType \
- or t is TupleType \
- or (t is InstanceType and isinstance(obj, UserList))
-
-def is_Tuple(obj):
- t = type(obj)
- return t is TupleType
+# often too slow.
-if hasattr(types, 'UnicodeType'):
- def is_String(obj):
+try:
+ class mystr(str):
+ pass
+except TypeError:
+ # An older Python version without new-style classes.
+ #
+ # The actual implementations here have been selected after timings
+ # coded up in bench/is_types.py (from the SCons source tree,
+ # see the scons-src distribution), mostly against Python 1.5.2.
+ # Key results from those timings:
+ #
+ # -- Storing the type of the object in a variable (t = type(obj))
+ # slows down the case where it's a native type and the first
+ # comparison will match, but nicely speeds up the case where
+ # it's a different native type. Since that's going to be
+ # common, it's a good tradeoff.
+ #
+ # -- The data show that calling isinstance() on an object that's
+ # a native type (dict, list or string) is expensive enough
+ # that checking up front for whether the object is of type
+ # InstanceType is a pretty big win, even though it does slow
+ # down the case where it really *is* an object instance a
+ # little bit.
+ def is_Dict(obj):
+ t = type(obj)
+ return t is DictType or \
+ (t is InstanceType and isinstance(obj, UserDict))
+
+ def is_List(obj):
t = type(obj)
- return t is StringType \
- or t is UnicodeType \
- or (t is InstanceType and isinstance(obj, UserString))
+ return t is ListType \
+ or (t is InstanceType and isinstance(obj, UserList))
+
+ def is_Sequence(obj):
+ t = type(obj)
+ return t is ListType \
+ or t is TupleType \
+ or (t is InstanceType and isinstance(obj, UserList))
+
+ def is_Tuple(obj):
+ t = type(obj)
+ return t is TupleType
+
+ if hasattr(types, 'UnicodeType'):
+ def is_String(obj):
+ t = type(obj)
+ return t is StringType \
+ or t is UnicodeType \
+ or (t is InstanceType and isinstance(obj, UserString))
+ else:
+ def is_String(obj):
+ t = type(obj)
+ return t is StringType \
+ or (t is InstanceType and isinstance(obj, UserString))
else:
+ # A modern Python version with new-style classes, so we can just use
+ # isinstance().
+ def is_Dict(obj):
+ return isinstance(obj, (dict, UserDict))
+
+ def is_List(obj):
+ return isinstance(obj, (list, UserList))
+
+ def is_Sequence(obj):
+ return isinstance(obj, (list, UserList, tuple))
+
+ def is_Tuple(obj):
+ return isinstance(obj, (tuple))
+
def is_String(obj):
- t = type(obj)
- return t is StringType \
- or (t is InstanceType and isinstance(obj, UserString))
+ # Empirically, Python versions with new-style classes all have unicode.
+ return isinstance(obj, (str, unicode, UserString))
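
The try/except wrapped around a throwaway str subclass is a feature test: Pythons without new-style classes raise TypeError at class-definition time, which selects the old type()-based predicates, while newer Pythons fall through to the plain isinstance() versions. The probe in isolation; the variable name is illustrative, and 1/0 is used instead of True/False to match the old Pythons this targets:

try:
    class _probe(str):          # fails with TypeError before new-style classes
        pass
    has_new_style_classes = 1
except TypeError:
    has_new_style_classes = 0
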
diff --git a/src/engine/SCons/UtilTests.py b/src/engine/SCons/UtilTests.py
index 3e8085b..44d6fa8 100644
--- a/src/engine/SCons/UtilTests.py
+++ b/src/engine/SCons/UtilTests.py
@@ -205,6 +205,13 @@ class UtilTestCase(unittest.TestCase):
def test_is_Dict(self):
assert is_Dict({})
assert is_Dict(UserDict())
+ try:
+ class mydict(dict):
+ pass
+ except TypeError:
+ pass
+ else:
+ assert is_Dict(mydict({}))
assert not is_Dict([])
assert not is_Dict(())
assert not is_Dict("")
@@ -215,6 +222,13 @@ class UtilTestCase(unittest.TestCase):
assert is_List([])
import UserList
assert is_List(UserList.UserList())
+ try:
+ class mylist(list):
+ pass
+ except TypeError:
+ pass
+ else:
+ assert is_List(mylist([]))
assert not is_List(())
assert not is_List({})
assert not is_List("")
@@ -231,12 +245,26 @@ class UtilTestCase(unittest.TestCase):
pass
else:
assert is_String(UserString.UserString(''))
+ try:
+ class mystr(str):
+ pass
+ except TypeError:
+ pass
+ else:
+ assert is_String(mystr(''))
assert not is_String({})
assert not is_String([])
assert not is_String(())
def test_is_Tuple(self):
assert is_Tuple(())
+ try:
+ class mytuple(tuple):
+ pass
+ except TypeError:
+ pass
+ else:
+ assert is_Tuple(mytuple(()))
assert not is_Tuple([])
assert not is_Tuple({})
assert not is_Tuple("")
diff --git a/src/engine/SCons/Warnings.py b/src/engine/SCons/Warnings.py
index b1d39ec..5354959 100644
--- a/src/engine/SCons/Warnings.py
+++ b/src/engine/SCons/Warnings.py
@@ -73,6 +73,9 @@ class NoParallelSupportWarning(Warning):
class ReservedVariableWarning(Warning):
pass
+class StackSizeWarning(Warning):
+ pass
+
_warningAsException = 0
# The below is a list of 2-tuples. The first element is a class object.
diff --git a/src/engine/SCons/compat/__init__.py b/src/engine/SCons/compat/__init__.py
index 47ae3be..91e3776 100644
--- a/src/engine/SCons/compat/__init__.py
+++ b/src/engine/SCons/compat/__init__.py
@@ -158,20 +158,14 @@ import shlex
try:
shlex.split
except AttributeError:
- # Pre-2.3 Python has no shlex.split function.
- def split(s, comments=False):
- import StringIO
- lex = shlex.shlex(StringIO.StringIO(s))
- lex.wordchars = lex.wordchars + '/\\-+,=:'
- result = []
- while True:
- tt = lex.get_token()
- if not tt:
- break
- result.append(tt)
- return result
- shlex.split = split
- del split
+ # Pre-2.3 Python has no shlex.split() function.
+ #
+ # The full white-space splitting semantics of shlex.split() are
+ # complicated to reproduce by hand, so just use a compatibility
+ # version of the shlex module cribbed from Python 2.5 with some
+ # minor modifications for older Python versions.
+ del shlex
+ import_as('_scons_shlex', 'shlex')
try:
import subprocess
diff --git a/src/engine/SCons/compat/_scons_shlex.py b/src/engine/SCons/compat/_scons_shlex.py
new file mode 100644
index 0000000..d6c1035
--- /dev/null
+++ b/src/engine/SCons/compat/_scons_shlex.py
@@ -0,0 +1,319 @@
+# -*- coding: iso-8859-1 -*-
+"""A lexical analyzer class for simple shell-like syntaxes."""
+
+# Module and documentation by Eric S. Raymond, 21 Dec 1998
+# Input stacking and error message cleanup added by ESR, March 2000
+# push_source() and pop_source() made explicit by ESR, January 2001.
+# Posix compliance, split(), string arguments, and
+# iterator interface by Gustavo Niemeyer, April 2003.
+
+import os.path
+import sys
+#from collections import deque
+
+class deque:
+ def __init__(self):
+ self.data = []
+ def __len__(self):
+ return len(self.data)
+ def appendleft(self, item):
+ self.data.insert(0, item)
+ def popleft(self):
+ return self.data.pop(0)
+
+try:
+ basestring
+except NameError:
+ import types
+ def is_basestring(s):
+ return type(s) is types.StringType
+else:
+ def is_basestring(s):
+ return isinstance(s, basestring)
+
+try:
+ from cStringIO import StringIO
+except ImportError:
+ from StringIO import StringIO
+
+__all__ = ["shlex", "split"]
+
+class shlex:
+ "A lexical analyzer class for simple shell-like syntaxes."
+ def __init__(self, instream=None, infile=None, posix=False):
+ if is_basestring(instream):
+ instream = StringIO(instream)
+ if instream is not None:
+ self.instream = instream
+ self.infile = infile
+ else:
+ self.instream = sys.stdin
+ self.infile = None
+ self.posix = posix
+ if posix:
+ self.eof = None
+ else:
+ self.eof = ''
+ self.commenters = '#'
+ self.wordchars = ('abcdfeghijklmnopqrstuvwxyz'
+ 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789_')
+ if self.posix:
+ self.wordchars = self.wordchars + ('ßàáâãäåæçèéêëìíîïðñòóôõöøùúûüýþÿ'
+ 'ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖØÙÚÛÜÝÞ')
+ self.whitespace = ' \t\r\n'
+ self.whitespace_split = False
+ self.quotes = '\'"'
+ self.escape = '\\'
+ self.escapedquotes = '"'
+ self.state = ' '
+ self.pushback = deque()
+ self.lineno = 1
+ self.debug = 0
+ self.token = ''
+ self.filestack = deque()
+ self.source = None
+ if self.debug:
+ print 'shlex: reading from %s, line %d' \
+ % (self.instream, self.lineno)
+
+ def push_token(self, tok):
+ "Push a token onto the stack popped by the get_token method"
+ if self.debug >= 1:
+ print "shlex: pushing token " + repr(tok)
+ self.pushback.appendleft(tok)
+
+ def push_source(self, newstream, newfile=None):
+ "Push an input source onto the lexer's input source stack."
+ if is_basestring(newstream):
+ newstream = StringIO(newstream)
+ self.filestack.appendleft((self.infile, self.instream, self.lineno))
+ self.infile = newfile
+ self.instream = newstream
+ self.lineno = 1
+ if self.debug:
+ if newfile is not None:
+ print 'shlex: pushing to file %s' % (self.infile,)
+ else:
+ print 'shlex: pushing to stream %s' % (self.instream,)
+
+ def pop_source(self):
+ "Pop the input source stack."
+ self.instream.close()
+ (self.infile, self.instream, self.lineno) = self.filestack.popleft()
+ if self.debug:
+ print 'shlex: popping to %s, line %d' \
+ % (self.instream, self.lineno)
+ self.state = ' '
+
+ def get_token(self):
+ "Get a token from the input stream (or from stack if it's nonempty)"
+ if self.pushback:
+ tok = self.pushback.popleft()
+ if self.debug >= 1:
+ print "shlex: popping token " + repr(tok)
+ return tok
+ # No pushback. Get a token.
+ raw = self.read_token()
+ # Handle inclusions
+ if self.source is not None:
+ while raw == self.source:
+ spec = self.sourcehook(self.read_token())
+ if spec:
+ (newfile, newstream) = spec
+ self.push_source(newstream, newfile)
+ raw = self.get_token()
+ # Maybe we got EOF instead?
+ while raw == self.eof:
+ if not self.filestack:
+ return self.eof
+ else:
+ self.pop_source()
+ raw = self.get_token()
+ # Neither inclusion nor EOF
+ if self.debug >= 1:
+ if raw != self.eof:
+ print "shlex: token=" + repr(raw)
+ else:
+ print "shlex: token=EOF"
+ return raw
+
+ def read_token(self):
+ quoted = False
+ escapedstate = ' '
+ while True:
+ nextchar = self.instream.read(1)
+ if nextchar == '\n':
+ self.lineno = self.lineno + 1
+ if self.debug >= 3:
+ print "shlex: in state", repr(self.state), \
+ "I see character:", repr(nextchar)
+ if self.state is None:
+ self.token = '' # past end of file
+ break
+ elif self.state == ' ':
+ if not nextchar:
+ self.state = None # end of file
+ break
+ elif nextchar in self.whitespace:
+ if self.debug >= 2:
+ print "shlex: I see whitespace in whitespace state"
+ if self.token or (self.posix and quoted):
+ break # emit current token
+ else:
+ continue
+ elif nextchar in self.commenters:
+ self.instream.readline()
+ self.lineno = self.lineno + 1
+ elif self.posix and nextchar in self.escape:
+ escapedstate = 'a'
+ self.state = nextchar
+ elif nextchar in self.wordchars:
+ self.token = nextchar
+ self.state = 'a'
+ elif nextchar in self.quotes:
+ if not self.posix:
+ self.token = nextchar
+ self.state = nextchar
+ elif self.whitespace_split:
+ self.token = nextchar
+ self.state = 'a'
+ else:
+ self.token = nextchar
+ if self.token or (self.posix and quoted):
+ break # emit current token
+ else:
+ continue
+ elif self.state in self.quotes:
+ quoted = True
+ if not nextchar: # end of file
+ if self.debug >= 2:
+ print "shlex: I see EOF in quotes state"
+ # XXX what error should be raised here?
+ raise ValueError, "No closing quotation"
+ if nextchar == self.state:
+ if not self.posix:
+ self.token = self.token + nextchar
+ self.state = ' '
+ break
+ else:
+ self.state = 'a'
+ elif self.posix and nextchar in self.escape and \
+ self.state in self.escapedquotes:
+ escapedstate = self.state
+ self.state = nextchar
+ else:
+ self.token = self.token + nextchar
+ elif self.state in self.escape:
+ if not nextchar: # end of file
+ if self.debug >= 2:
+ print "shlex: I see EOF in escape state"
+ # XXX what error should be raised here?
+ raise ValueError, "No escaped character"
+ # In posix shells, only the quote itself or the escape
+ # character may be escaped within quotes.
+ if escapedstate in self.quotes and \
+ nextchar != self.state and nextchar != escapedstate:
+ self.token = self.token + self.state
+ self.token = self.token + nextchar
+ self.state = escapedstate
+ elif self.state == 'a':
+ if not nextchar:
+ self.state = None # end of file
+ break
+ elif nextchar in self.whitespace:
+ if self.debug >= 2:
+ print "shlex: I see whitespace in word state"
+ self.state = ' '
+ if self.token or (self.posix and quoted):
+ break # emit current token
+ else:
+ continue
+ elif nextchar in self.commenters:
+ self.instream.readline()
+ self.lineno = self.lineno + 1
+ if self.posix:
+ self.state = ' '
+ if self.token or (self.posix and quoted):
+ break # emit current token
+ else:
+ continue
+ elif self.posix and nextchar in self.quotes:
+ self.state = nextchar
+ elif self.posix and nextchar in self.escape:
+ escapedstate = 'a'
+ self.state = nextchar
+ elif nextchar in self.wordchars or nextchar in self.quotes \
+ or self.whitespace_split:
+ self.token = self.token + nextchar
+ else:
+ self.pushback.appendleft(nextchar)
+ if self.debug >= 2:
+ print "shlex: I see punctuation in word state"
+ self.state = ' '
+ if self.token:
+ break # emit current token
+ else:
+ continue
+ result = self.token
+ self.token = ''
+ if self.posix and not quoted and result == '':
+ result = None
+ if self.debug > 1:
+ if result:
+ print "shlex: raw token=" + repr(result)
+ else:
+ print "shlex: raw token=EOF"
+ return result
+
+ def sourcehook(self, newfile):
+ "Hook called on a filename to be sourced."
+ if newfile[0] == '"':
+ newfile = newfile[1:-1]
+ # This implements cpp-like semantics for relative-path inclusion.
+ if is_basestring(self.infile) and not os.path.isabs(newfile):
+ newfile = os.path.join(os.path.dirname(self.infile), newfile)
+ return (newfile, open(newfile, "r"))
+
+ def error_leader(self, infile=None, lineno=None):
+ "Emit a C-compiler-like, Emacs-friendly error-message leader."
+ if infile is None:
+ infile = self.infile
+ if lineno is None:
+ lineno = self.lineno
+ return "\"%s\", line %d: " % (infile, lineno)
+
+ def __iter__(self):
+ return self
+
+ def next(self):
+ token = self.get_token()
+ if token == self.eof:
+ raise StopIteration
+ return token
+
+def split(s, comments=False):
+ lex = shlex(s, posix=True)
+ lex.whitespace_split = True
+ if not comments:
+ lex.commenters = ''
+ #return list(lex)
+ result = []
+ while True:
+ token = lex.get_token()
+ if token == lex.eof:
+ break
+ result.append(token)
+ return result
+
+if __name__ == '__main__':
+ if len(sys.argv) == 1:
+ lexer = shlex()
+ else:
+ file = sys.argv[1]
+ lexer = shlex(open(file), file)
+ while 1:
+ tt = lexer.get_token()
+ if tt:
+ print "Token: " + repr(tt)
+ else:
+ break
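
This compatibility module gives pre-2.3 Pythons the same shlex.split() behaviour that SCons/compat/__init__.py now imports under the standard-library name. A small usage sketch of the split() function defined above, assuming this source tree is importable; the command string is arbitrary:

from SCons.compat import _scons_shlex

args = _scons_shlex.split('gcc -o "my prog" main.c')
assert args == ['gcc', '-o', 'my prog', 'main.c']
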
diff --git a/src/engine/SCons/cpp.py b/src/engine/SCons/cpp.py
index 8620936..cdd6a3a 100644
--- a/src/engine/SCons/cpp.py
+++ b/src/engine/SCons/cpp.py
@@ -45,11 +45,16 @@ import string
# that we want to fetch, using the regular expressions to which the lists
# of preprocessor directives map.
cpp_lines_dict = {
- # Fetch the rest of a #if/#elif/#ifdef/#ifndef/#import/#include/
- # #include_next line as one argument.
- ('if', 'elif', 'ifdef', 'ifndef', 'import', 'include', 'include_next',)
+ # Fetch the rest of a #if/#elif/#ifdef/#ifndef as one argument,
+ # separated from the keyword by white space.
+ ('if', 'elif', 'ifdef', 'ifndef',)
: '\s+(.+)',
+ # Fetch the rest of a #import/#include/#include_next line as one
+ # argument, with white space optional.
+ ('import', 'include', 'include_next',)
+ : '\s*(.+)',
+
# We don't care what comes after a #else or #endif line.
('else', 'endif',) : '',
@@ -183,7 +188,13 @@ class FunctionEvaluator:
"""
self.name = name
self.args = function_arg_separator.split(args)
- self.expansion = string.split(expansion, '##')
+ try:
+ expansion = string.split(expansion, '##')
+ except (AttributeError, TypeError):
+ # Python 1.5 throws TypeError if "expansion" isn't a string;
+ # later versions throw AttributeError.
+ pass
+ self.expansion = expansion
def __call__(self, *values):
"""
Evaluates the expansion of a #define macro function called
@@ -228,12 +239,14 @@ class PreProcessor:
"""
The main workhorse class for handling C pre-processing.
"""
- def __init__(self, current='.', cpppath=[], dict={}, all=0):
+ def __init__(self, current=os.curdir, cpppath=(), dict={}, all=0):
global Table
+ cpppath = tuple(cpppath)
+
self.searchpath = {
- '"' : [current] + cpppath,
- '<' : cpppath + [current],
+ '"' : (current,) + cpppath,
+ '<' : cpppath + (current,),
}
# Initialize our C preprocessor namespace for tracking the
@@ -254,7 +267,9 @@ class PreProcessor:
# stack and changing what method gets called for each relevant
# directive we might see next at this level (#else, #elif).
# #endif will simply pop the stack.
- d = {}
+ d = {
+ 'scons_current_file' : self.scons_current_file
+ }
for op in Table.keys():
d[op] = getattr(self, 'do_' + op)
self.default_table = d
@@ -278,25 +293,34 @@ class PreProcessor:
(m[0],) + t[m[0]].match(m[1]).groups(),
cpp_tuples)
- def __call__(self, contents):
+ def __call__(self, file):
+ """
+ Pre-processes a file.
+
+ This is the main public entry point.
+ """
+ self.current_file = file
+ return self.process_contents(self.read_file(file), file)
+
+ def process_contents(self, contents, fname=None):
"""
Pre-processes a file contents.
- This is the main entry point, which
+ This is the main internal entry point.
"""
self.stack = []
self.dispatch_table = self.default_table.copy()
+ self.current_file = fname
self.tuples = self.tupleize(contents)
- self.result = []
+ self.initialize_result(fname)
while self.tuples:
t = self.tuples.pop(0)
# Uncomment to see the list of tuples being processed (e.g.,
# to validate the CPP lines are being translated correctly).
#print t
self.dispatch_table[t[0]](t)
-
- return self.result
+ return self.finalize_result(fname)
# Dispatch table stack manipulation methods.
@@ -325,6 +349,9 @@ class PreProcessor:
"""
pass
+ def scons_current_file(self, t):
+ self.current_file = t[1]
+
def eval_expression(self, t):
"""
Evaluates a C preprocessor expression.
@@ -337,17 +364,29 @@ class PreProcessor:
try: return eval(t, self.cpp_namespace)
except (NameError, TypeError): return 0
+ def initialize_result(self, fname):
+ self.result = [fname]
+
+ def finalize_result(self, fname):
+ return self.result[1:]
+
def find_include_file(self, t):
"""
Finds the #include file for a given preprocessor tuple.
"""
fname = t[2]
for d in self.searchpath[t[1]]:
- f = os.path.join(d, fname)
+ if d == os.curdir:
+ f = fname
+ else:
+ f = os.path.join(d, fname)
if os.path.isfile(f):
return f
return None
+ def read_file(self, file):
+ return open(file).read()
+
# Start and stop processing include lines.
def start_handling_includes(self, t=None):
@@ -478,8 +517,10 @@ class PreProcessor:
if include_file:
#print "include_file =", include_file
self.result.append(include_file)
- contents = open(include_file).read()
- new_tuples = self.tupleize(contents)
+ contents = self.read_file(include_file)
+ new_tuples = [('scons_current_file', include_file)] + \
+ self.tupleize(contents) + \
+ [('scons_current_file', self.current_file)]
self.tuples[:] = new_tuples + self.tuples
# Date: Tue, 22 Nov 2005 20:26:09 -0500
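
With these changes the preprocessor object is called with a file name and reads it itself, while process_contents() remains the internal entry point (the unit tests below exercise both). A small usage sketch; the include directory and source file name are hypothetical:

import SCons.cpp

p = SCons.cpp.DumbPreProcessor(current='.', cpppath=['include'])
includes = p('main.c')      # list of files actually #include'd, e.g. ['foo.h']
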
diff --git a/src/engine/SCons/cppTests.py b/src/engine/SCons/cppTests.py
index 0959e2c..33fd01d 100644
--- a/src/engine/SCons/cppTests.py
+++ b/src/engine/SCons/cppTests.py
@@ -23,10 +23,10 @@
__revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
+import string
import sys
import unittest
-print sys.path
import cpp
@@ -297,6 +297,9 @@ macro_function_input = """
#include FUNC39c(ZERO, ONE)
#include FUNC40c(ZERO, ONE)
+
+/* Make sure we don't die if the expansion isn't a string. */
+#define FUNC_INTEGER(x) 1
"""
@@ -312,6 +315,12 @@ token_pasting_input = """
"""
+no_space_input = """
+#include<file43-yes>
+#include"file44-yes"
+"""
+
+
# pp_class = PreProcessor
# #pp_class = DumbPreProcessor
@@ -331,55 +340,61 @@ class cppTestCase(unittest.TestCase):
def test_basic(self):
"""Test basic #include scanning"""
expect = self.basic_expect
- result = self.cpp(basic_input)
+ result = self.cpp.process_contents(basic_input)
assert expect == result, (expect, result)
def test_substitution(self):
"""Test substitution of #include files using CPP variables"""
expect = self.substitution_expect
- result = self.cpp(substitution_input)
+ result = self.cpp.process_contents(substitution_input)
assert expect == result, (expect, result)
def test_ifdef(self):
"""Test basic #ifdef processing"""
expect = self.ifdef_expect
- result = self.cpp(ifdef_input)
+ result = self.cpp.process_contents(ifdef_input)
assert expect == result, (expect, result)
def test_if_boolean(self):
"""Test #if with Boolean values"""
expect = self.if_boolean_expect
- result = self.cpp(if_boolean_input)
+ result = self.cpp.process_contents(if_boolean_input)
assert expect == result, (expect, result)
def test_if_defined(self):
"""Test #if defined() idioms"""
expect = self.if_defined_expect
- result = self.cpp(if_defined_input)
+ result = self.cpp.process_contents(if_defined_input)
assert expect == result, (expect, result)
def test_expression(self):
"""Test #if with arithmetic expressions"""
expect = self.expression_expect
- result = self.cpp(expression_input)
+ result = self.cpp.process_contents(expression_input)
assert expect == result, (expect, result)
def test_undef(self):
"""Test #undef handling"""
expect = self.undef_expect
- result = self.cpp(undef_input)
+ result = self.cpp.process_contents(undef_input)
assert expect == result, (expect, result)
def test_macro_function(self):
"""Test using macro functions to express file names"""
expect = self.macro_function_expect
- result = self.cpp(macro_function_input)
+ result = self.cpp.process_contents(macro_function_input)
assert expect == result, (expect, result)
def test_token_pasting(self):
- """Test taken-pasting to construct file names"""
+ """Test token-pasting to construct file names"""
expect = self.token_pasting_expect
- result = self.cpp(token_pasting_input)
+ result = self.cpp.process_contents(token_pasting_input)
+ assert expect == result, (expect, result)
+
+ def test_no_space(self):
+ """Test no space between #include and the quote"""
+ expect = self.no_space_expect
+ result = self.cpp.process_contents(no_space_input)
assert expect == result, (expect, result)
class cppAllTestCase(cppTestCase):
@@ -460,6 +475,11 @@ class PreProcessorTestCase(cppAllTestCase):
('include', '<', 'file42-yes'),
]
+ no_space_expect = [
+ ('include', '<', 'file43-yes'),
+ ('include', '"', 'file44-yes'),
+ ]
+
class DumbPreProcessorTestCase(cppAllTestCase):
cpp_class = cpp.DumbPreProcessor
@@ -560,14 +580,126 @@ class DumbPreProcessorTestCase(cppAllTestCase):
('include', '<', 'file42-yes'),
]
+ no_space_expect = [
+ ('include', '<', 'file43-yes'),
+ ('include', '"', 'file44-yes'),
+ ]
+
+
+
+import os
+import re
+import shutil
+import tempfile
+
+tempfile.template = 'cppTests.'
+if os.name in ('posix', 'nt'):
+ tempfile.template = 'cppTests.' + str(os.getpid()) + '.'
+else:
+ tempfile.template = 'cppTests.'
+
+_Cleanup = []
+
+def _clean():
+ for dir in _Cleanup:
+ if os.path.exists(dir):
+ shutil.rmtree(dir)
+
+sys.exitfunc = _clean
+
+class fileTestCase(unittest.TestCase):
+ cpp_class = cpp.DumbPreProcessor
+
+ def setUp(self):
+ try:
+ path = tempfile.mktemp(prefix=tempfile.template)
+ except TypeError:
+ # The tempfile.mktemp() function in earlier versions of Python
+ # has no prefix argument, but uses the tempfile.template
+ # value that we set above.
+ path = tempfile.mktemp()
+ _Cleanup.append(path)
+ os.mkdir(path)
+ self.tempdir = path
+ self.orig_cwd = os.getcwd()
+ os.chdir(path)
+
+ def tearDown(self):
+ shutil.rmtree(self.tempdir)
+ _Cleanup.remove(self.tempdir)
+ os.chdir(self.orig_cwd)
+
+ def strip_initial_spaces(self, s):
+ #lines = s.split('\n')
+ lines = string.split(s, '\n')
+ spaces = re.match(' *', lines[0]).group(0)
+ def strip_spaces(l, spaces=spaces):
+ #if l.startswith(spaces):
+ if l[:len(spaces)] == spaces:
+ l = l[len(spaces):]
+ return l
+ #return '\n'.join([ strip_spaces(l) for l in lines ])
+ return string.join(map(strip_spaces, lines), '\n')
+
+ def write(self, file, contents):
+ open(file, 'w').write(self.strip_initial_spaces(contents))
+
+ def test_basic(self):
+ """Test basic file inclusion"""
+ self.write('f1.h', """\
+ #include "f2.h"
+ """)
+ self.write('f2.h', """\
+ #include <f3.h>
+ """)
+ self.write('f3.h', """\
+ """)
+ p = cpp.DumbPreProcessor(current = os.curdir,
+ cpppath = [os.curdir])
+ result = p('f1.h')
+ assert result == ['f2.h', 'f3.h'], result
+
+ def test_current_file(self):
+ """Test use of the .current_file attribute"""
+ self.write('f1.h', """\
+ #include <f2.h>
+ """)
+ self.write('f2.h', """\
+ #include "f3.h"
+ """)
+ self.write('f3.h', """\
+ """)
+ class MyPreProcessor(cpp.DumbPreProcessor):
+ def __init__(self, *args, **kw):
+ apply(cpp.DumbPreProcessor.__init__, (self,) + args, kw)
+ self.files = []
+ def __call__(self, file):
+ self.files.append(file)
+ return cpp.DumbPreProcessor.__call__(self, file)
+ def scons_current_file(self, t):
+ r = cpp.DumbPreProcessor.scons_current_file(self, t)
+ self.files.append(self.current_file)
+ return r
+ p = MyPreProcessor(current = os.curdir, cpppath = [os.curdir])
+ result = p('f1.h')
+ assert result == ['f2.h', 'f3.h'], result
+ assert p.files == ['f1.h', 'f2.h', 'f3.h', 'f2.h', 'f1.h'], p.files
+
+
+
if __name__ == '__main__':
suite = unittest.TestSuite()
tclasses = [ PreProcessorTestCase,
DumbPreProcessorTestCase,
+ fileTestCase,
]
for tclass in tclasses:
names = unittest.getTestCaseNames(tclass, 'test_')
+ try:
+ names = list(set(names))
+ except NameError:
+ pass
+ names.sort()
suite.addTests(map(tclass, names))
if not unittest.TextTestRunner().run(suite).wasSuccessful():
sys.exit(1)
-
diff --git a/src/script/scons-time.py b/src/script/scons-time.py
index 284443a..d651ba9 100644
--- a/src/script/scons-time.py
+++ b/src/script/scons-time.py
@@ -38,6 +38,7 @@ __revision__ = "__FILE__ __REVISION__ __DATE__ __DEVELOPER__"
import getopt
import glob
import os
+import os.path
import re
import shutil
import string
@@ -62,6 +63,11 @@ except NameError:
def make_temp_file(**kw):
try:
result = tempfile.mktemp(**kw)
+ try:
+ result = os.path.realpath(result)
+ except AttributeError:
+ # Python 2.1 has no os.path.realpath() method.
+ pass
except TypeError:
try:
save_template = tempfile.template
@@ -122,7 +128,12 @@ class Line:
if self.comment:
print '# %s' % self.comment
for x, y in self.points:
- print fmt % (x, y)
+ # If y is None, it usually represents some kind of break
+ # in the line's index number. We might want to represent
+ # this in some other way rather than just drawing the line straight
+ # between the two points on either side.
+ if not y is None:
+ print fmt % (x, y)
print 'e'
def get_x_values(self):
@@ -148,47 +159,59 @@ class Gnuplotter(Plotter):
def vertical_bar(self, x, type, label, comment):
if self.get_min_x() <= x and x <= self.get_max_x():
- points = [(x, 0), (x, self.get_max_x())]
+ points = [(x, 0), (x, self.max_graph_value(self.get_max_y()))]
self.line(points, type, label, comment)
def get_all_x_values(self):
result = []
for line in self.lines:
result.extend(line.get_x_values())
- return result
+ return filter(None, result)
def get_all_y_values(self):
result = []
for line in self.lines:
result.extend(line.get_y_values())
- return result
+ return filter(None, result)
def get_min_x(self):
try:
return self.min_x
except AttributeError:
- self.min_x = min(self.get_all_x_values())
+ try:
+ self.min_x = min(self.get_all_x_values())
+ except ValueError:
+ self.min_x = 0
return self.min_x
def get_max_x(self):
try:
return self.max_x
except AttributeError:
- self.max_x = max(self.get_all_x_values())
+ try:
+ self.max_x = max(self.get_all_x_values())
+ except ValueError:
+ self.max_x = 0
return self.max_x
def get_min_y(self):
try:
return self.min_y
except AttributeError:
- self.min_y = min(self.get_all_y_values())
+ try:
+ self.min_y = min(self.get_all_y_values())
+ except ValueError:
+ self.min_y = 0
return self.min_y
def get_max_y(self):
try:
return self.max_y
except AttributeError:
- self.max_y = max(self.get_all_y_values())
+ try:
+ self.max_y = max(self.get_all_y_values())
+ except ValueError:
+ self.max_y = 0
return self.max_y
def draw(self):
@@ -200,10 +223,17 @@ class Gnuplotter(Plotter):
print 'set title "%s"' % self.title
print 'set key %s' % self.key_location
+ min_y = self.get_min_y()
+ max_y = self.max_graph_value(self.get_max_y())
+ range = max_y - min_y
+ incr = range / 10.0
+ start = min_y + (max_y / 2.0) + (2.0 * incr)
+ position = [ start - (i * incr) for i in xrange(5) ]
+
inx = 1
- max_y = self.max_graph_value(self.get_max_y())/2
for line in self.lines:
- line.print_label(inx, line.points[0][0]-1, max_y)
+ line.print_label(inx, line.points[0][0]-1,
+ position[(inx-1) % len(position)])
inx += 1
plot_strings = [ self.plot_string(l) for l in self.lines ]
@@ -485,6 +515,8 @@ class SConsTimer:
for file in files:
t = line_function(file, *args, **kw)
+ if t is None:
+ t = []
diff = len(columns) - len(t)
if diff > 0:
t += [''] * diff
@@ -564,7 +596,7 @@ class SConsTimer:
except ValueError:
x, type, label = bar_tuple
comment = label
- gp.vertical_bar(x, type, None, label, comment)
+ gp.vertical_bar(x, type, label, comment)
gp.draw()
@@ -613,10 +645,17 @@ class SConsTimer:
else:
search_string = time_string
contents = open(file).read()
+ if not contents:
+ sys.stderr.write('file %s has no contents!\n' % repr(file))
+ return None
result = re.findall(r'%s: ([\d\.]*)' % search_string, contents)[-4:]
result = [ float(r) for r in result ]
if not time_string is None:
- result = result[0]
+ try:
+ result = result[0]
+ except IndexError:
+ sys.stderr.write('file %s has no results!\n' % repr(file))
+ return None
return result
def get_function_profile(self, file, function):
@@ -1294,7 +1333,12 @@ class SConsTimer:
commands.append((shutil.copytree, 'cp -r %%s %%s', archive, dest))
else:
suffix = self.archive_splitext(archive)[1]
- commands.append(self.unpack_map[suffix] + (archive,))
+ unpack_command = self.unpack_map.get(suffix)
+ if not unpack_command:
+ dest = os.path.split(archive)[1]
+ commands.append((shutil.copyfile, 'cp %%s %%s', archive, dest))
+ else:
+ commands.append(unpack_command + (archive,))
commands.extend([
(os.chdir, 'cd %%s', self.subdir),
diff --git a/src/test_strings.py b/src/test_strings.py
index 0d6e6ac..c446cad 100644
--- a/src/test_strings.py
+++ b/src/test_strings.py
@@ -127,6 +127,7 @@ check_list = [
'engine/SCons/compat/_scons_optparse.py',
'engine/SCons/compat/_scons_sets.py',
'engine/SCons/compat/_scons_sets15.py',
+ 'engine/SCons/compat/_scons_shlex.py',
'engine/SCons/compat/_scons_subprocess.py',
'engine/SCons/compat/_scons_textwrap.py',
'engine/SCons/Conftest.py',
@@ -151,6 +152,7 @@ check_list = [
'engine/SCons/compat/_scons_optparse.py',
'engine/SCons/compat/_scons_sets.py',
'engine/SCons/compat/_scons_sets15.py',
+ 'engine/SCons/compat/_scons_shlex.py',
'engine/SCons/compat/_scons_subprocess.py',
'engine/SCons/compat/_scons_textwrap.py',
'engine/SCons/Conftest.py',
@@ -171,6 +173,7 @@ check_list = [
'SCons/compat/_scons_optparse.py',
'SCons/compat/_scons_sets.py',
'SCons/compat/_scons_sets15.py',
+ 'SCons/compat/_scons_shlex.py',
'SCons/compat/_scons_subprocess.py',
'SCons/compat/_scons_textwrap.py',
'SCons/Conftest.py',
@@ -214,6 +217,7 @@ check_list = [
'src/engine/SCons/compat/_scons_optparse.py',
'src/engine/SCons/compat/_scons_sets.py',
'src/engine/SCons/compat/_scons_sets15.py',
+ 'src/engine/SCons/compat/_scons_shlex.py',
'src/engine/SCons/compat/_scons_subprocess.py',
'src/engine/SCons/compat/_scons_textwrap.py',
'src/engine/SCons/Conftest.py',