"time gcc -MM file"
This is similar to -Eonly.
-//===---------------------------------------------------------------------===//
-
-Creating and using a PTH file for performance measurement (use a release build).
-
-$ clang -ccc-pch-is-pth -x objective-c-header INPUTS/Cocoa_h.m -o /tmp/tokencache
-$ clang -cc1 -token-cache /tmp/tokencache INPUTS/Cocoa_h.m
-
//===---------------------------------------------------------------------===//
C++ Template Instantiation benchmark:
Precompiled Headers
===================
-Clang supports two implementations of precompiled headers. The default
-implementation, precompiled headers (:doc:`PCH <PCHInternals>`) uses a
+Clang supports precompiled headers (:doc:`PCH <PCHInternals>`), which use a
serialized representation of Clang's internal data structures, encoded with the
`LLVM bitstream format <https://llvm.org/docs/BitCodeFormat.html>`_.
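+
+As a minimal illustration (the file names here are only placeholders), a PCH
+can be produced and consumed directly with ``clang -cc1`` using the
+``-emit-pch`` and ``-include-pch`` options:
+
+.. code-block:: console
+
+  $ clang -cc1 -emit-pch test.h -o test.h.pch
+  $ clang -cc1 -include-pch test.h.pch -fsyntax-only test.c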
-Pretokenized headers (:doc:`PTH <PTHInternals>`), on the other hand, contain a
-serialized representation of the tokens encountered when preprocessing a header
-(and anything that header includes).
The Frontend Library
====================
+++ /dev/null
-==========================
-Pretokenized Headers (PTH)
-==========================
-
-This document first describes the low-level interface for using PTH and
-then briefly elaborates on its design and implementation. If you are
-interested in the end-user view, please see the :ref:`User's Manual
-<usersmanual-precompiled-headers>`.
-
-Using Pretokenized Headers with ``clang`` (Low-level Interface)
-===============================================================
-
-The Clang compiler frontend, ``clang -cc1``, supports three command line
-options for generating and using PTH files.
-
-To generate PTH files using ``clang -cc1``, use the option ``-emit-pth``:
-
-.. code-block:: console
-
- $ clang -cc1 test.h -emit-pth -o test.h.pth
-
-This option is transparently used by ``clang`` when generating PTH
-files. Similarly, PTH files can be used as prefix headers using the
-``-include-pth`` option:
-
-.. code-block:: console
-
- $ clang -cc1 -include-pth test.h.pth test.c -o test.s
-
-Alternatively, Clang's PTH files can be used as a raw "token-cache" (or
-"content" cache) of the source included by the original header file.
-This means that the contents of the PTH file are searched as substitutes
-for *any* source files that are used by ``clang -cc1`` to process a
-source file. This is done by specifying the ``-token-cache`` option:
-
-.. code-block:: console
-
- $ cat test.h
- #include <stdio.h>
- $ clang -cc1 -emit-pth test.h -o test.h.pth
- $ cat test.c
- #include "test.h"
- $ clang -cc1 test.c -o test -token-cache test.h.pth
-
-In this example the contents of ``stdio.h`` (and the files it includes)
-will be retrieved from ``test.h.pth``, as the PTH file is being used in
-this case as a raw cache of the contents of ``test.h``. This is a
-low-level interface used to both implement the high-level PTH interface
-as well as to provide alternative means to use PTH-style caching.
-
-PTH Design and Implementation
-=============================
-
-Unlike GCC's precompiled headers, which cache the full ASTs and
-preprocessor state of a header file, Clang's pretokenized header files
-mainly cache the raw lexer *tokens* that are needed to segment the
-stream of characters in a source file into keywords, identifiers, and
-operators. Consequently, PTH serves to mainly directly speed up the
-lexing and preprocessing of a source file, while parsing and
-type-checking must be completely redone every time a PTH file is used.
-
-Basic Design Tradeoffs
-----------------------
-
-In the long term there are plans to provide an alternate PCH
-implementation for Clang that also caches the work for parsing and type
-checking the contents of header files. The current implementation of PCH
-in Clang as pretokenized header files was motivated by the following
-factors:
-
-**Language independence**
- PTH files work with any language that
- Clang's lexer can handle, including C, Objective-C, and (in the early
- stages) C++. This means development on language features at the
- parsing level or above (which is basically almost all interesting
- pieces) does not require PTH to be modified.
-
-**Simple design**
- Relatively speaking, PTH has a simple design and
- implementation, making it easy to test. Further, because the
- machinery for PTH resides at the lower-levels of the Clang library
- stack it is fairly straightforward to profile and optimize.
-
-Further, compared to GCC's PCH implementation (which is the dominate
-precompiled header file implementation that Clang can be directly
-compared against) the PTH design in Clang yields several attractive
-features:
-
-**Architecture independence**
- In contrast to GCC's PCH files (and
- those of several other compilers), Clang's PTH files are architecture
- independent, requiring only a single PTH file when building a
- program for multiple architectures.
-
- For example, on Mac OS X one may wish to compile a "universal binary"
- that runs on PowerPC, 32-bit Intel (i386), and 64-bit Intel
- architectures. In contrast, GCC requires a PCH file for each
- architecture, as the definitions of types in the AST are
- architecture-specific. Since a Clang PTH file essentially represents
- a lexical cache of header files, a single PTH file can be safely used
- when compiling for multiple architectures. This can also reduce
- compile times because only a single PTH file needs to be generated
- during a build instead of several.
-
-**Reduced memory pressure**
- Similar to GCC, Clang reads PTH files
- via the use of memory mapping (i.e., ``mmap``). Clang, however,
- memory maps PTH files as read-only, meaning that multiple invocations
- of ``clang -cc1`` can share the same pages in memory from a
- memory-mapped PTH file. In comparison, GCC also memory maps its PCH
- files but also modifies those pages in memory, incurring the
- copy-on-write costs. The read-only nature of PTH can greatly reduce
- memory pressure for builds involving multiple cores, thus improving
- overall scalability.
-
-**Fast generation**
- PTH files can be generated in a small fraction
- of the time needed to generate GCC's PCH files. Since PTH/PCH
- generation is a serial operation that typically blocks progress
- during a build, faster generation time leads to improved processor
- utilization with parallel builds on multicore machines.
-
-Despite these strengths, PTH's simple design suffers some algorithmic
-handicaps compared to other PCH strategies such as those used by GCC.
-While PTH can greatly speed up the processing time of a header file, the
-amount of work required to process a header file is still roughly linear
-in the size of the header file. In contrast, the amount of work done by
-GCC to process a precompiled header is (theoretically) constant (the
-ASTs for the header are literally memory mapped into the compiler). This
-means that only the pieces of the header file that are referenced by the
-source file including the header are the only ones the compiler needs to
-process during actual compilation. While GCC's particular implementation
-of PCH mitigates some of these algorithmic strengths via the use of
-copy-on-write pages, the approach itself can fundamentally dominate at
-an algorithmic level, especially when one considers header files of
-arbitrary size.
-
-There is also a PCH implementation for Clang based on the lazy
-deserialization of ASTs. This approach theoretically has the same
-constant-time algorithmic advantages just mentioned but also retains some
-of the strengths of PTH such as reduced memory pressure (ideal for
-multi-core builds).
-
-Internal PTH Optimizations
---------------------------
-
-While the main optimization employed by PTH is to reduce lexing time of
-header files by caching pre-lexed tokens, PTH also employs several other
-optimizations to speed up the processing of header files:
-
-- ``stat`` caching: PTH files cache information obtained via calls to
- ``stat`` that ``clang -cc1`` uses to resolve which files are included
- by ``#include`` directives. This greatly reduces the overhead
- involved in context-switching to the kernel to resolve included
- files.
-
-- Fast skipping of ``#ifdef`` ... ``#endif`` chains: PTH files
- record the basic structure of nested preprocessor blocks. When the
- condition of the preprocessor block is false, all of its tokens are
- immediately skipped instead of requiring them to be handled by
- Clang's preprocessor.
-
-
Non-comprehensive list of changes in this release
-------------------------------------------------
-- ...
+- The experimental Pretokenized Headers (PTH) feature was removed from Clang
+  in its entirety. It did not work correctly for roughly a third of the
+  possible token kinds and was unmaintained.
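+
+  Users who relied on PTH prefix headers can use precompiled headers instead.
+  A minimal sketch (``prefix.h`` is only an illustrative file name):
+
+  .. code-block:: console
+
+    $ clang -x c-header prefix.h -o prefix.h.pch
+    $ clang -include prefix.h test.c -o test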
New Compiler Flags
------------------
"unable to interface with target machine">;
def err_fe_unable_to_open_output : Error<
"unable to open output file '%0': '%1'">;
-def err_fe_pth_file_has_no_source_header : Error<
- "PTH file '%0' does not designate an original source header file for -include-pth">;
def warn_fe_macro_contains_embedded_newline : Warning<
"macro '%0' contains embedded newline; text after the newline is ignored">;
def warn_fe_cc_print_header_failure : Warning<
def err_lexing_string : Error<"failure when lexing a string">;
def err_placeholder_in_source : Error<"editor placeholder in source file">;
-
-//===----------------------------------------------------------------------===//
-// PTH Diagnostics
-//===----------------------------------------------------------------------===//
-def err_invalid_pth_file : Error<
- "invalid or corrupt PTH file '%0'">;
-
//===----------------------------------------------------------------------===//
// Preprocessor Diagnostics
//===----------------------------------------------------------------------===//
"'#pragma clang module end'">;
def note_pp_module_begin_here : Note<
"entering module '%0' due to this pragma">;
-def err_pp_module_build_pth : Error<
- "'#pragma clang module build' not supported in pretokenized header">;
def err_pp_module_build_missing_end : Error<
"no matching '#pragma clang module endbuild' for this '#pragma clang module build'">;
HelpText<"Generate pre-compiled module file from a C++ module interface">;
def emit_header_module : Flag<["-"], "emit-header-module">,
HelpText<"Generate pre-compiled module file from a set of header files">;
-def emit_pth : Flag<["-"], "emit-pth">,
- HelpText<"Generate pre-tokenized header file">;
def emit_pch : Flag<["-"], "emit-pch">,
HelpText<"Generate pre-compiled header file">;
def emit_llvm_bc : Flag<["-"], "emit-llvm-bc">,
// Preprocessor Options
//===----------------------------------------------------------------------===//
-def include_pth : Separate<["-"], "include-pth">, MetaVarName<"<file>">,
- HelpText<"Include file before parsing">;
def chain_include : Separate<["-"], "chain-include">, MetaVarName<"<file>">,
HelpText<"Include and chain a header file after turning it into PCH">;
def preamble_bytes_EQ : Joined<["-"], "preamble-bytes=">,
HelpText<"Assume that the precompiled header is a precompiled preamble "
"covering the first N bytes of the main file">;
-def token_cache : Separate<["-"], "token-cache">, MetaVarName<"<path>">,
- HelpText<"Use specified token cache file">;
def detailed_preprocessing_record : Flag<["-"], "detailed-preprocessing-record">,
HelpText<"include a detailed record of preprocessing actions">;
unsigned CheckInputsExist : 1;
public:
- /// Use lazy precompiled headers for PCH support.
- unsigned CCCUsePCH : 1;
-
/// Force clang to emit reproducer for driver invocation. This is enabled
/// indirectly by setting FORCE_CLANG_DIAGNOSTICS_CRASH environment variable
/// or when using the -gen-reproducer driver flag.
def ccc_gcc_name : Separate<["-"], "ccc-gcc-name">, InternalDriverOpt,
HelpText<"Name for native GCC compiler">,
MetaVarName<"<gcc-path>">;
-def ccc_pch_is_pch : Flag<["-"], "ccc-pch-is-pch">, InternalDriverOpt,
- HelpText<"Use lazy PCH for precompiled headers">;
-def ccc_pch_is_pth : Flag<["-"], "ccc-pch-is-pth">, InternalDriverOpt,
- HelpText<"Use pretokenized headers for precompiled headers">;
class InternalDebugOpt : Group<internal_debug_Group>,
Flags<[DriverOption, HelpHidden, CoreOption]>;
void ExecuteAction() override;
};
-class GeneratePTHAction : public PreprocessorFrontendAction {
-protected:
- void ExecuteAction() override;
-};
-
class PreprocessOnlyAction : public PreprocessorFrontendAction {
protected:
void ExecuteAction() override;
/// Generate pre-compiled header.
GeneratePCH,
- /// Generate pre-tokenized header.
- GeneratePTH,
-
/// Only execute frontend initialization.
InitOnly,
StringRef OutputPath = {},
bool ShowDepth = true, bool MSStyle = false);
-/// Cache tokens for use with PCH. Note that this requires a seekable stream.
-void CacheTokens(Preprocessor &PP, raw_pwrite_stream *OS);
-
/// The ChainedIncludesSource class converts headers to chained PCHs in
/// memory, mainly for testing.
IntrusiveRefCntPtr<ExternalSemaSource>
+++ /dev/null
-//===- PTHLexer.h - Lexer based on Pre-tokenized input ----------*- C++ -*-===//
-//
-// The LLVM Compiler Infrastructure
-//
-// This file is distributed under the University of Illinois Open Source
-// License. See LICENSE.TXT for details.
-//
-//===----------------------------------------------------------------------===//
-//
-// This file defines the PTHLexer interface.
-//
-//===----------------------------------------------------------------------===//
-
-#ifndef LLVM_CLANG_LEX_PTHLEXER_H
-#define LLVM_CLANG_LEX_PTHLEXER_H
-
-#include "clang/Basic/SourceLocation.h"
-#include "clang/Basic/TokenKinds.h"
-#include "clang/Lex/PreprocessorLexer.h"
-#include "clang/Lex/Token.h"
-
-namespace clang {
-
-class Preprocessor;
-class PTHManager;
-
-class PTHLexer : public PreprocessorLexer {
- SourceLocation FileStartLoc;
-
- /// TokBuf - Buffer from PTH file containing raw token data.
- const unsigned char* TokBuf;
-
- /// CurPtr - Pointer into current offset of the token buffer where
- /// the next token will be read.
- const unsigned char* CurPtr;
-
- /// LastHashTokPtr - Pointer into TokBuf of the last processed '#'
- /// token that appears at the start of a line.
- const unsigned char* LastHashTokPtr = nullptr;
-
- /// PPCond - Pointer to a side table in the PTH file that provides a
- /// a concise summary of the preprocessor conditional block structure.
- /// This is used to perform quick skipping of conditional blocks.
- const unsigned char* PPCond;
-
- /// CurPPCondPtr - Pointer inside PPCond that refers to the next entry
- /// to process when doing quick skipping of preprocessor blocks.
- const unsigned char* CurPPCondPtr;
-
- /// ReadToken - Used by PTHLexer to read tokens TokBuf.
- void ReadToken(Token &T);
-
- bool LexEndOfFile(Token &Result);
-
- /// PTHMgr - The PTHManager object that created this PTHLexer.
- PTHManager& PTHMgr;
-
- Token EofToken;
-
-protected:
- friend class PTHManager;
-
- /// Create a PTHLexer for the specified token stream.
- PTHLexer(Preprocessor &pp, FileID FID, const unsigned char *D,
- const unsigned char* ppcond, PTHManager &PM);
-
-public:
- PTHLexer(const PTHLexer &) = delete;
- PTHLexer &operator=(const PTHLexer &) = delete;
- ~PTHLexer() override = default;
-
- /// Lex - Return the next token.
- bool Lex(Token &Tok);
-
- void getEOF(Token &Tok);
-
- /// DiscardToEndOfLine - Read the rest of the current preprocessor line as an
- /// uninterpreted string. This switches the lexer out of directive mode.
- void DiscardToEndOfLine();
-
- /// isNextPPTokenLParen - Return 1 if the next unexpanded token will return a
- /// tok::l_paren token, 0 if it is something else and 2 if there are no more
- /// tokens controlled by this lexer.
- unsigned isNextPPTokenLParen() {
- // isNextPPTokenLParen is not on the hot path, and all we care about is
- // whether or not we are at a token with kind tok::eof or tok::l_paren.
- // Just read the first byte from the current token pointer to determine
- // its kind.
- tok::TokenKind x = (tok::TokenKind)*CurPtr;
- return x == tok::eof ? 2 : x == tok::l_paren;
- }
-
- /// IndirectLex - An indirect call to 'Lex' that can be invoked via
- /// the PreprocessorLexer interface.
- void IndirectLex(Token &Result) override { Lex(Result); }
-
- /// getSourceLocation - Return a source location for the token in
- /// the current file.
- SourceLocation getSourceLocation() override;
-
- /// SkipBlock - Used by Preprocessor to skip the current conditional block.
- bool SkipBlock();
-};
-
-} // namespace clang
-
-#endif // LLVM_CLANG_LEX_PTHLEXER_H
+++ /dev/null
-//===- PTHManager.h - Manager object for PTH processing ---------*- C++ -*-===//
-//
-// The LLVM Compiler Infrastructure
-//
-// This file is distributed under the University of Illinois Open Source
-// License. See LICENSE.TXT for details.
-//
-//===----------------------------------------------------------------------===//
-//
-// This file defines the PTHManager interface.
-//
-//===----------------------------------------------------------------------===//
-
-#ifndef LLVM_CLANG_LEX_PTHMANAGER_H
-#define LLVM_CLANG_LEX_PTHMANAGER_H
-
-#include "clang/Basic/IdentifierTable.h"
-#include "clang/Basic/SourceLocation.h"
-#include "llvm/ADT/STLExtras.h"
-#include "llvm/ADT/StringRef.h"
-#include "llvm/Support/Allocator.h"
-#include "llvm/Support/OnDiskHashTable.h"
-#include <memory>
-
-namespace llvm {
-
-class MemoryBuffer;
-
-} // namespace llvm
-
-namespace clang {
-
-class DiagnosticsEngine;
-class FileSystemStatCache;
-class Preprocessor;
-class PTHLexer;
-
-class PTHManager : public IdentifierInfoLookup {
- friend class PTHLexer;
- friend class PTHStatCache;
-
- class PTHFileLookupTrait;
- class PTHStringLookupTrait;
-
- using PTHStringIdLookup = llvm::OnDiskChainedHashTable<PTHStringLookupTrait>;
- using PTHFileLookup = llvm::OnDiskChainedHashTable<PTHFileLookupTrait>;
-
- /// The memory mapped PTH file.
- std::unique_ptr<const llvm::MemoryBuffer> Buf;
-
- /// Alloc - Allocator used for IdentifierInfo objects.
- llvm::BumpPtrAllocator Alloc;
-
- /// IdMap - A lazily generated cache mapping from persistent identifiers to
- /// IdentifierInfo*.
- std::unique_ptr<IdentifierInfo *[], llvm::FreeDeleter> PerIDCache;
-
- /// FileLookup - Abstract data structure used for mapping between files
- /// and token data in the PTH file.
- std::unique_ptr<PTHFileLookup> FileLookup;
-
- /// IdDataTable - Array representing the mapping from persistent IDs to the
- /// data offset within the PTH file containing the information to
- /// reconsitute an IdentifierInfo.
- const unsigned char* const IdDataTable;
-
- /// SortedIdTable - Abstract data structure mapping from strings to
- /// persistent IDs. This is used by get().
- std::unique_ptr<PTHStringIdLookup> StringIdLookup;
-
- /// NumIds - The number of identifiers in the PTH file.
- const unsigned NumIds;
-
- /// PP - The Preprocessor object that will use this PTHManager to create
- /// PTHLexer objects.
- Preprocessor* PP = nullptr;
-
- /// SpellingBase - The base offset within the PTH memory buffer that
- /// contains the cached spellings for literals.
- const unsigned char* const SpellingBase;
-
- /// OriginalSourceFile - A null-terminated C-string that specifies the name
- /// if the file (if any) that was to used to generate the PTH cache.
- const char* OriginalSourceFile;
-
- /// This constructor is intended to only be called by the static 'Create'
- /// method.
- PTHManager(std::unique_ptr<const llvm::MemoryBuffer> buf,
- std::unique_ptr<PTHFileLookup> fileLookup,
- const unsigned char *idDataTable,
- std::unique_ptr<IdentifierInfo *[], llvm::FreeDeleter> perIDCache,
- std::unique_ptr<PTHStringIdLookup> stringIdLookup, unsigned numIds,
- const unsigned char *spellingBase, const char *originalSourceFile);
-
- /// getSpellingAtPTHOffset - Used by PTHLexer classes to get the cached
- /// spelling for a token.
- unsigned getSpellingAtPTHOffset(unsigned PTHOffset, const char*& Buffer);
-
- /// GetIdentifierInfo - Used to reconstruct IdentifierInfo objects from the
- /// PTH file.
- IdentifierInfo *GetIdentifierInfo(unsigned PersistentID) {
- // Check if the IdentifierInfo has already been resolved.
- if (IdentifierInfo* II = PerIDCache[PersistentID])
- return II;
- return LazilyCreateIdentifierInfo(PersistentID);
- }
- IdentifierInfo* LazilyCreateIdentifierInfo(unsigned PersistentID);
-
-public:
- // The current PTH version.
- enum { Version = 10 };
-
- PTHManager(const PTHManager &) = delete;
- PTHManager &operator=(const PTHManager &) = delete;
- ~PTHManager() override;
-
- /// getOriginalSourceFile - Return the full path to the original header
- /// file name that was used to generate the PTH cache.
- const char* getOriginalSourceFile() const {
- return OriginalSourceFile;
- }
-
- /// get - Return the identifier token info for the specified named identifier.
- /// Unlike the version in IdentifierTable, this returns a pointer instead
- /// of a reference. If the pointer is nullptr then the IdentifierInfo cannot
- /// be found.
- IdentifierInfo *get(StringRef Name) override;
-
- /// Create - This method creates PTHManager objects. The 'file' argument
- /// is the name of the PTH file. This method returns nullptr upon failure.
- static PTHManager *Create(StringRef file, DiagnosticsEngine &Diags);
-
- void setPreprocessor(Preprocessor *pp) { PP = pp; }
-
- /// CreateLexer - Return a PTHLexer that "lexes" the cached tokens for the
- /// specified file. This method returns nullptr if no cached tokens exist.
- /// It is the responsibility of the caller to 'delete' the returned object.
- PTHLexer *CreateLexer(FileID FID);
-
- /// createStatCache - Returns a FileSystemStatCache object for use with
- /// FileManager objects. These objects use the PTH data to speed up
- /// calls to stat by memoizing their results from when the PTH file
- /// was generated.
- std::unique_ptr<FileSystemStatCache> createStatCache();
-};
-
-} // namespace clang
-
-#endif // LLVM_CLANG_LEX_PTHMANAGER_H
#include "clang/Lex/ModuleLoader.h"
#include "clang/Lex/ModuleMap.h"
#include "clang/Lex/PPCallbacks.h"
-#include "clang/Lex/PTHLexer.h"
#include "clang/Lex/Token.h"
#include "clang/Lex/TokenLexer.h"
#include "llvm/ADT/ArrayRef.h"
class PreprocessingRecord;
class PreprocessorLexer;
class PreprocessorOptions;
-class PTHManager;
class ScratchBuffer;
class TargetInfo;
/// External source of macros.
ExternalPreprocessorSource *ExternalSource;
- /// An optional PTHManager object used for getting tokens from
- /// a token cache rather than lexing the original source file.
- std::unique_ptr<PTHManager> PTH;
-
/// A BumpPtrAllocator object used to quickly allocate and release
/// objects internal to the Preprocessor.
llvm::BumpPtrAllocator BP;
/// The current top of the stack that we're lexing from if
/// not expanding a macro and we are lexing directly from source code.
///
- /// Only one of CurLexer, CurPTHLexer, or CurTokenLexer will be non-null.
+ /// Only one of CurLexer or CurTokenLexer will be non-null.
std::unique_ptr<Lexer> CurLexer;
- /// The current top of stack that we're lexing from if
- /// not expanding from a macro and we are lexing from a PTH cache.
- ///
- /// Only one of CurLexer, CurPTHLexer, or CurTokenLexer will be non-null.
- std::unique_ptr<PTHLexer> CurPTHLexer;
-
/// The current top of the stack what we're lexing from
/// if not expanding a macro.
///
- /// This is an alias for either CurLexer or CurPTHLexer.
+ /// This is an alias for CurLexer.
PreprocessorLexer *CurPPLexer = nullptr;
/// Used to find the current FileEntry, if CurLexer is non-null
/// The kind of lexer we're currently working with.
enum CurLexerKind {
CLK_Lexer,
- CLK_PTHLexer,
CLK_TokenLexer,
CLK_CachingLexer,
CLK_LexAfterModuleImport
enum CurLexerKind CurLexerKind;
Module *TheSubmodule;
std::unique_ptr<Lexer> TheLexer;
- std::unique_ptr<PTHLexer> ThePTHLexer;
PreprocessorLexer *ThePPLexer;
std::unique_ptr<TokenLexer> TheTokenLexer;
const DirectoryLookup *TheDirLookup;
// versions, only needed to pacify MSVC.
IncludeStackInfo(enum CurLexerKind CurLexerKind, Module *TheSubmodule,
std::unique_ptr<Lexer> &&TheLexer,
- std::unique_ptr<PTHLexer> &&ThePTHLexer,
PreprocessorLexer *ThePPLexer,
std::unique_ptr<TokenLexer> &&TheTokenLexer,
const DirectoryLookup *TheDirLookup)
: CurLexerKind(std::move(CurLexerKind)),
TheSubmodule(std::move(TheSubmodule)), TheLexer(std::move(TheLexer)),
- ThePTHLexer(std::move(ThePTHLexer)),
ThePPLexer(std::move(ThePPLexer)),
TheTokenLexer(std::move(TheTokenLexer)),
TheDirLookup(std::move(TheDirLookup)) {}
Builtin::Context &getBuiltinInfo() { return BuiltinInfo; }
llvm::BumpPtrAllocator &getPreprocessorAllocator() { return BP; }
- void setPTHManager(PTHManager* pm);
-
- PTHManager *getPTHManager() { return PTH.get(); }
-
void setExternalSource(ExternalPreprocessorSource *Source) {
ExternalSource = Source;
}
CachedTokens[CachedLexPos-1] = Tok;
}
- /// Recompute the current lexer kind based on the CurLexer/CurPTHLexer/
+ /// Recompute the current lexer kind based on the CurLexer/
/// CurTokenLexer pointers.
void recomputeCurLexerKind();
void PushIncludeMacroStack() {
assert(CurLexerKind != CLK_CachingLexer && "cannot push a caching lexer");
IncludeMacroStack.emplace_back(CurLexerKind, CurLexerSubmodule,
- std::move(CurLexer), std::move(CurPTHLexer),
- CurPPLexer, std::move(CurTokenLexer),
- CurDirLookup);
+ std::move(CurLexer), CurPPLexer,
+ std::move(CurTokenLexer), CurDirLookup);
CurPPLexer = nullptr;
}
void PopIncludeMacroStack() {
CurLexer = std::move(IncludeMacroStack.back().TheLexer);
- CurPTHLexer = std::move(IncludeMacroStack.back().ThePTHLexer);
CurPPLexer = IncludeMacroStack.back().ThePPLexer;
CurTokenLexer = std::move(IncludeMacroStack.back().TheTokenLexer);
CurDirLookup = IncludeMacroStack.back().TheDirLookup;
bool FoundNonSkipPortion, bool FoundElse,
SourceLocation ElseLoc = SourceLocation());
- /// A fast PTH version of SkipExcludedConditionalBlock.
- void PTHSkipExcludedConditionalBlock();
-
/// Information about the result for evaluating an expression for a
/// preprocessor directive.
struct DirectiveEvalResult {
/// start lexing tokens from it instead of the current buffer.
void EnterSourceFileWithLexer(Lexer *TheLexer, const DirectoryLookup *Dir);
- /// Add a lexer to the top of the include stack and
- /// start getting tokens from it using the PTH cache.
- void EnterSourceFileWithPTH(PTHLexer *PL, const DirectoryLookup *Dir);
-
/// Set the FileID for the preprocessor predefines.
void setPredefinesFileID(FileID FID) {
assert(PredefinesFileID.isInvalid() && "PredefinesFileID already set!");
bool InCachingLexMode() const {
// If the Lexer pointers are 0 and IncludeMacroStack is empty, it means
// that we are past EOF, not that we are in CachingLex mode.
- return !CurPPLexer && !CurTokenLexer && !CurPTHLexer &&
- !IncludeMacroStack.empty();
+ return !CurPPLexer && !CurTokenLexer && !IncludeMacroStack.empty();
}
void EnterCachingLexMode();
/// clients don't use them.
bool WriteCommentListToPCH = true;
- /// The implicit PTH input included at the start of the translation unit, or
- /// empty.
- std::string ImplicitPTHInclude;
-
- /// If given, a PTH cache file to use for speeding up header parsing.
- std::string TokenCache;
-
/// When enabled, preprocessor is in a mode for parsing a single file only.
///
/// Disables #includes of other files and if there are unresolved identifiers
ChainedIncludes.clear();
DumpDeserializedPCHDecls = false;
ImplicitPCHInclude.clear();
- ImplicitPTHInclude.clear();
- TokenCache.clear();
SingleFileParseMode = false;
LexEditorPlaceholders = true;
RetainRemappedFileBuffers = true;
PPOpts.Includes.insert(PPOpts.Includes.begin(), OriginalFile);
PPOpts.ImplicitPCHInclude.clear();
}
- // FIXME: Get the original header of a PTH as well.
- CInvok->getPreprocessorOpts().ImplicitPTHInclude.clear();
std::string define = getARCMTMacroName();
define += '=';
CInvok->getPreprocessorOpts().addMacroDef(define);
CCPrintOptions(false), CCPrintHeaders(false), CCLogDiagnostics(false),
CCGenDiagnostics(false), TargetTriple(TargetTriple),
CCCGenericGCCName(""), Saver(Alloc), CheckInputsExist(true),
- CCCUsePCH(true), GenReproducer(false),
- SuppressMissingInputWarning(false) {
+ GenReproducer(false), SuppressMissingInputWarning(false) {
// Provide a sane fallback if no VFS is specified.
if (!this->VFS)
CCCPrintBindings = Args.hasArg(options::OPT_ccc_print_bindings);
if (const Arg *A = Args.getLastArg(options::OPT_ccc_gcc_name))
CCCGenericGCCName = A->getValue();
- CCCUsePCH =
- Args.hasFlag(options::OPT_ccc_pch_is_pch, options::OPT_ccc_pch_is_pth);
GenReproducer = Args.hasFlag(options::OPT_gen_reproducer,
options::OPT_fno_crash_diagnostics,
!!::getenv("FORCE_CLANG_DIAGNOSTICS_CRASH"));
bool IsFirstImplicitInclude = !RenderedImplicitInclude;
RenderedImplicitInclude = true;
- // Use PCH if the user requested it.
- bool UsePCH = D.CCCUsePCH;
-
- bool FoundPTH = false;
bool FoundPCH = false;
SmallString<128> P(A->getValue());
// We want the files to have a name like foo.h.pch. Add a dummy extension
// so that replace_extension does the right thing.
P += ".dummy";
- if (UsePCH) {
- llvm::sys::path::replace_extension(P, "pch");
- if (llvm::sys::fs::exists(P))
- FoundPCH = true;
- }
+ llvm::sys::path::replace_extension(P, "pch");
+ if (llvm::sys::fs::exists(P))
+ FoundPCH = true;
if (!FoundPCH) {
- llvm::sys::path::replace_extension(P, "pth");
- if (llvm::sys::fs::exists(P))
- FoundPTH = true;
- }
-
- if (!FoundPCH && !FoundPTH) {
llvm::sys::path::replace_extension(P, "gch");
if (llvm::sys::fs::exists(P)) {
- FoundPCH = UsePCH;
- FoundPTH = !UsePCH;
+ FoundPCH = true;
}
}
- if (FoundPCH || FoundPTH) {
+ if (FoundPCH) {
if (IsFirstImplicitInclude) {
A->claim();
- if (UsePCH)
- CmdArgs.push_back("-include-pch");
- else
- CmdArgs.push_back("-include-pth");
+ CmdArgs.push_back("-include-pch");
CmdArgs.push_back(Args.MakeArgString(P));
continue;
} else {
// Also ignore explicit -force_cpusubtype_ALL option.
(void)Args.hasArg(options::OPT_force__cpusubtype__ALL);
} else if (isa<PrecompileJobAction>(JA)) {
- // Use PCH if the user requested it.
- bool UsePCH = D.CCCUsePCH;
-
if (JA.getType() == types::TY_Nothing)
CmdArgs.push_back("-fsyntax-only");
else if (JA.getType() == types::TY_ModuleFile)
CmdArgs.push_back(IsHeaderModulePrecompile
? "-emit-header-module"
: "-emit-module-interface");
- else if (UsePCH)
- CmdArgs.push_back("-emit-pch");
else
- CmdArgs.push_back("-emit-pth");
+ CmdArgs.push_back("-emit-pch");
} else if (isa<VerifyPCHJobAction>(JA)) {
CmdArgs.push_back("-verify-pch");
} else {
// Claim some arguments which clang supports automatically.
// -fpch-preprocess is used with gcc to add a special marker in the output to
- // include the PCH file. Clang's PTH solution is completely transparent, so we
- // do not need to deal with it at all.
+ // include the PCH file.
Args.ClaimAllArgs(options::OPT_fpch_preprocess);
// Claim some arguments which clang doesn't support, but we don't
ASTConsumers.cpp
ASTMerge.cpp
ASTUnit.cpp
- CacheTokens.cpp
ChainedDiagnosticConsumer.cpp
ChainedIncludesSource.cpp
CodeGenOptions.cpp
+++ /dev/null
-//===--- CacheTokens.cpp - Caching of lexer tokens for PTH support --------===//
-//
-// The LLVM Compiler Infrastructure
-//
-// This file is distributed under the University of Illinois Open Source
-// License. See LICENSE.TXT for details.
-//
-//===----------------------------------------------------------------------===//
-//
-// This provides a possible implementation of PTH support for Clang that is
-// based on caching lexed tokens and identifiers.
-//
-//===----------------------------------------------------------------------===//
-
-#include "clang/Basic/Diagnostic.h"
-#include "clang/Basic/FileManager.h"
-#include "clang/Basic/FileSystemStatCache.h"
-#include "clang/Basic/IdentifierTable.h"
-#include "clang/Basic/SourceManager.h"
-#include "clang/Frontend/Utils.h"
-#include "clang/Lex/Lexer.h"
-#include "clang/Lex/PTHManager.h"
-#include "clang/Lex/Preprocessor.h"
-#include "llvm/ADT/StringMap.h"
-#include "llvm/Support/DJB.h"
-#include "llvm/Support/EndianStream.h"
-#include "llvm/Support/FileSystem.h"
-#include "llvm/Support/MemoryBuffer.h"
-#include "llvm/Support/OnDiskHashTable.h"
-#include "llvm/Support/Path.h"
-
-// FIXME: put this somewhere else?
-#ifndef S_ISDIR
-#define S_ISDIR(x) (((x)&_S_IFDIR)!=0)
-#endif
-
-using namespace clang;
-
-//===----------------------------------------------------------------------===//
-// PTH-specific stuff.
-//===----------------------------------------------------------------------===//
-
-typedef uint32_t Offset;
-
-namespace {
-class PTHEntry {
- Offset TokenData, PPCondData;
-
-public:
- PTHEntry() {}
-
- PTHEntry(Offset td, Offset ppcd)
- : TokenData(td), PPCondData(ppcd) {}
-
- Offset getTokenOffset() const { return TokenData; }
- Offset getPPCondTableOffset() const { return PPCondData; }
-};
-
-
-class PTHEntryKeyVariant {
- union {
- const FileEntry *FE;
- // FIXME: Use "StringRef Path;" when MSVC 2013 is dropped.
- const char *PathPtr;
- };
- size_t PathSize;
- enum { IsFE = 0x1, IsDE = 0x2, IsNoExist = 0x0 } Kind;
- FileData *Data;
-
-public:
- PTHEntryKeyVariant(const FileEntry *fe) : FE(fe), Kind(IsFE), Data(nullptr) {}
-
- PTHEntryKeyVariant(FileData *Data, StringRef Path)
- : PathPtr(Path.data()), PathSize(Path.size()), Kind(IsDE),
- Data(new FileData(*Data)) {}
-
- explicit PTHEntryKeyVariant(StringRef Path)
- : PathPtr(Path.data()), PathSize(Path.size()), Kind(IsNoExist),
- Data(nullptr) {}
-
- bool isFile() const { return Kind == IsFE; }
-
- StringRef getString() const {
- return Kind == IsFE ? FE->getName() : StringRef(PathPtr, PathSize);
- }
-
- unsigned getKind() const { return (unsigned) Kind; }
-
- void EmitData(raw_ostream& Out) {
- using namespace llvm::support;
- endian::Writer LE(Out, little);
- switch (Kind) {
- case IsFE: {
- // Emit stat information.
- llvm::sys::fs::UniqueID UID = FE->getUniqueID();
- LE.write<uint64_t>(UID.getFile());
- LE.write<uint64_t>(UID.getDevice());
- LE.write<uint64_t>(FE->getModificationTime());
- LE.write<uint64_t>(FE->getSize());
- } break;
- case IsDE:
- // Emit stat information.
- LE.write<uint64_t>(Data->UniqueID.getFile());
- LE.write<uint64_t>(Data->UniqueID.getDevice());
- LE.write<uint64_t>(Data->ModTime);
- LE.write<uint64_t>(Data->Size);
- delete Data;
- break;
- default:
- break;
- }
- }
-
- unsigned getRepresentationLength() const {
- return Kind == IsNoExist ? 0 : 4 * 8;
- }
-};
-
-class FileEntryPTHEntryInfo {
-public:
- typedef PTHEntryKeyVariant key_type;
- typedef key_type key_type_ref;
-
- typedef PTHEntry data_type;
- typedef const PTHEntry& data_type_ref;
-
- typedef unsigned hash_value_type;
- typedef unsigned offset_type;
-
- static hash_value_type ComputeHash(PTHEntryKeyVariant V) {
- return llvm::djbHash(V.getString());
- }
-
- static std::pair<unsigned,unsigned>
- EmitKeyDataLength(raw_ostream& Out, PTHEntryKeyVariant V,
- const PTHEntry& E) {
- using namespace llvm::support;
- endian::Writer LE(Out, little);
-
- unsigned n = V.getString().size() + 1 + 1;
- LE.write<uint16_t>(n);
-
- unsigned m = V.getRepresentationLength() + (V.isFile() ? 4 + 4 : 0);
- LE.write<uint8_t>(m);
-
- return std::make_pair(n, m);
- }
-
- static void EmitKey(raw_ostream& Out, PTHEntryKeyVariant V, unsigned n){
- using namespace llvm::support;
- // Emit the entry kind.
- Out << char(V.getKind());
- // Emit the string.
- Out.write(V.getString().data(), n - 1);
- }
-
- static void EmitData(raw_ostream& Out, PTHEntryKeyVariant V,
- const PTHEntry& E, unsigned) {
- using namespace llvm::support;
- endian::Writer LE(Out, little);
-
- // For file entries emit the offsets into the PTH file for token data
- // and the preprocessor blocks table.
- if (V.isFile()) {
- LE.write<uint32_t>(E.getTokenOffset());
- LE.write<uint32_t>(E.getPPCondTableOffset());
- }
-
- // Emit any other data associated with the key (i.e., stat information).
- V.EmitData(Out);
- }
-};
-
-class OffsetOpt {
- bool valid;
- Offset off;
-public:
- OffsetOpt() : valid(false) {}
- bool hasOffset() const { return valid; }
- Offset getOffset() const { assert(valid); return off; }
- void setOffset(Offset o) { off = o; valid = true; }
-};
-} // end anonymous namespace
-
-typedef llvm::OnDiskChainedHashTableGenerator<FileEntryPTHEntryInfo> PTHMap;
-
-namespace {
-class PTHWriter {
- typedef llvm::DenseMap<const IdentifierInfo*,uint32_t> IDMap;
- typedef llvm::StringMap<OffsetOpt, llvm::BumpPtrAllocator> CachedStrsTy;
-
- raw_pwrite_stream &Out;
- Preprocessor& PP;
- IDMap IM;
- std::vector<llvm::StringMapEntry<OffsetOpt>*> StrEntries;
- PTHMap PM;
- CachedStrsTy CachedStrs;
- uint32_t idcount;
- Offset CurStrOffset;
-
- //// Get the persistent id for the given IdentifierInfo*.
- uint32_t ResolveID(const IdentifierInfo* II);
-
- /// Emit a token to the PTH file.
- void EmitToken(const Token& T);
-
- void Emit8(uint32_t V) {
- Out << char(V);
- }
-
- void Emit16(uint32_t V) {
- using namespace llvm::support;
- endian::write<uint16_t>(Out, V, little);
- }
-
- void Emit32(uint32_t V) {
- using namespace llvm::support;
- endian::write<uint32_t>(Out, V, little);
- }
-
- void EmitBuf(const char *Ptr, unsigned NumBytes) {
- Out.write(Ptr, NumBytes);
- }
-
- void EmitString(StringRef V) {
- using namespace llvm::support;
- endian::write<uint16_t>(Out, V.size(), little);
- EmitBuf(V.data(), V.size());
- }
-
- /// EmitIdentifierTable - Emits two tables to the PTH file. The first is
- /// a hashtable mapping from identifier strings to persistent IDs.
- /// The second is a straight table mapping from persistent IDs to string data
- /// (the keys of the first table).
- std::pair<Offset, Offset> EmitIdentifierTable();
-
- /// EmitFileTable - Emit a table mapping from file name strings to PTH
- /// token data.
- Offset EmitFileTable() { return PM.Emit(Out); }
-
- PTHEntry LexTokens(Lexer& L);
- Offset EmitCachedSpellings();
-
-public:
- PTHWriter(raw_pwrite_stream &out, Preprocessor &pp)
- : Out(out), PP(pp), idcount(0), CurStrOffset(0) {}
-
- PTHMap &getPM() { return PM; }
- void GeneratePTH(StringRef MainFile);
-};
-} // end anonymous namespace
-
-uint32_t PTHWriter::ResolveID(const IdentifierInfo* II) {
- // Null IdentifierInfo's map to the persistent ID 0.
- if (!II)
- return 0;
-
- IDMap::iterator I = IM.find(II);
- if (I != IM.end())
- return I->second; // We've already added 1.
-
- IM[II] = ++idcount; // Pre-increment since '0' is reserved for NULL.
- return idcount;
-}
-
-void PTHWriter::EmitToken(const Token& T) {
- // Emit the token kind, flags, and length.
- Emit32(((uint32_t) T.getKind()) | ((((uint32_t) T.getFlags())) << 8)|
- (((uint32_t) T.getLength()) << 16));
-
- if (!T.isLiteral()) {
- Emit32(ResolveID(T.getIdentifierInfo()));
- } else {
- // We cache *un-cleaned* spellings. This gives us 100% fidelity with the
- // source code.
- StringRef s(T.getLiteralData(), T.getLength());
-
- // Get the string entry.
- auto &E = *CachedStrs.insert(std::make_pair(s, OffsetOpt())).first;
-
- // If this is a new string entry, bump the PTH offset.
- if (!E.second.hasOffset()) {
- E.second.setOffset(CurStrOffset);
- StrEntries.push_back(&E);
- CurStrOffset += s.size() + 1;
- }
-
- // Emit the relative offset into the PTH file for the spelling string.
- Emit32(E.second.getOffset());
- }
-
- // Emit the offset into the original source file of this token so that we
- // can reconstruct its SourceLocation.
- Emit32(PP.getSourceManager().getFileOffset(T.getLocation()));
-}
-
-PTHEntry PTHWriter::LexTokens(Lexer& L) {
- // Pad 0's so that we emit tokens to a 4-byte alignment.
- // This speed up reading them back in.
- using namespace llvm::support;
- endian::Writer LE(Out, little);
- uint32_t TokenOff = Out.tell();
- for (uint64_t N = llvm::OffsetToAlignment(TokenOff, 4); N; --N, ++TokenOff)
- LE.write<uint8_t>(0);
-
- // Keep track of matching '#if' ... '#endif'.
- typedef std::vector<std::pair<Offset, unsigned> > PPCondTable;
- PPCondTable PPCond;
- std::vector<unsigned> PPStartCond;
- bool ParsingPreprocessorDirective = false;
- Token Tok;
-
- do {
- L.LexFromRawLexer(Tok);
- NextToken:
-
- if ((Tok.isAtStartOfLine() || Tok.is(tok::eof)) &&
- ParsingPreprocessorDirective) {
- // Insert an eod token into the token cache. It has the same
- // position as the next token that is not on the same line as the
- // preprocessor directive. Observe that we continue processing
- // 'Tok' when we exit this branch.
- Token Tmp = Tok;
- Tmp.setKind(tok::eod);
- Tmp.clearFlag(Token::StartOfLine);
- Tmp.setIdentifierInfo(nullptr);
- EmitToken(Tmp);
- ParsingPreprocessorDirective = false;
- }
-
- if (Tok.is(tok::raw_identifier)) {
- PP.LookUpIdentifierInfo(Tok);
- EmitToken(Tok);
- continue;
- }
-
- if (Tok.is(tok::hash) && Tok.isAtStartOfLine()) {
- // Special processing for #include. Store the '#' token and lex
- // the next token.
- assert(!ParsingPreprocessorDirective);
- Offset HashOff = (Offset) Out.tell();
-
- // Get the next token.
- Token NextTok;
- L.LexFromRawLexer(NextTok);
-
- // If we see the start of line, then we had a null directive "#". In
- // this case, discard both tokens.
- if (NextTok.isAtStartOfLine())
- goto NextToken;
-
- // The token is the start of a directive. Emit it.
- EmitToken(Tok);
- Tok = NextTok;
-
- // Did we see 'include'/'import'/'include_next'?
- if (Tok.isNot(tok::raw_identifier)) {
- EmitToken(Tok);
- continue;
- }
-
- IdentifierInfo* II = PP.LookUpIdentifierInfo(Tok);
- tok::PPKeywordKind K = II->getPPKeywordID();
-
- ParsingPreprocessorDirective = true;
-
- switch (K) {
- case tok::pp_not_keyword:
- // Invalid directives "#foo" can occur in #if 0 blocks etc, just pass
- // them through.
- default:
- break;
-
- case tok::pp_include:
- case tok::pp_import:
- case tok::pp_include_next: {
- // Save the 'include' token.
- EmitToken(Tok);
- // Lex the next token as an include string.
- L.setParsingPreprocessorDirective(true);
- L.LexIncludeFilename(Tok);
- L.setParsingPreprocessorDirective(false);
- assert(!Tok.isAtStartOfLine());
- if (Tok.is(tok::raw_identifier))
- PP.LookUpIdentifierInfo(Tok);
-
- break;
- }
- case tok::pp_if:
- case tok::pp_ifdef:
- case tok::pp_ifndef: {
- // Add an entry for '#if' and friends. We initially set the target
- // index to 0. This will get backpatched when we hit #endif.
- PPStartCond.push_back(PPCond.size());
- PPCond.push_back(std::make_pair(HashOff, 0U));
- break;
- }
- case tok::pp_endif: {
- // Add an entry for '#endif'. We set the target table index to itself.
- // This will later be set to zero when emitting to the PTH file. We
- // use 0 for uninitialized indices because that is easier to debug.
- unsigned index = PPCond.size();
- // Backpatch the opening '#if' entry.
- assert(!PPStartCond.empty());
- assert(PPCond.size() > PPStartCond.back());
- assert(PPCond[PPStartCond.back()].second == 0);
- PPCond[PPStartCond.back()].second = index;
- PPStartCond.pop_back();
- // Add the new entry to PPCond.
- PPCond.push_back(std::make_pair(HashOff, index));
- EmitToken(Tok);
-
- // Some files have gibberish on the same line as '#endif'.
- // Discard these tokens.
- do
- L.LexFromRawLexer(Tok);
- while (Tok.isNot(tok::eof) && !Tok.isAtStartOfLine());
- // We have the next token in hand.
- // Don't immediately lex the next one.
- goto NextToken;
- }
- case tok::pp_elif:
- case tok::pp_else: {
- // Add an entry for #elif or #else.
- // This serves as both a closing and opening of a conditional block.
- // This means that its entry will get backpatched later.
- unsigned index = PPCond.size();
- // Backpatch the previous '#if' entry.
- assert(!PPStartCond.empty());
- assert(PPCond.size() > PPStartCond.back());
- assert(PPCond[PPStartCond.back()].second == 0);
- PPCond[PPStartCond.back()].second = index;
- PPStartCond.pop_back();
- // Now add '#elif' as a new block opening.
- PPCond.push_back(std::make_pair(HashOff, 0U));
- PPStartCond.push_back(index);
- break;
- }
- }
- }
-
- EmitToken(Tok);
- }
- while (Tok.isNot(tok::eof));
-
- assert(PPStartCond.empty() && "Error: imblanced preprocessor conditionals.");
-
- // Next write out PPCond.
- Offset PPCondOff = (Offset) Out.tell();
-
- // Write out the size of PPCond so that clients can identifer empty tables.
- Emit32(PPCond.size());
-
- for (unsigned i = 0, e = PPCond.size(); i!=e; ++i) {
- Emit32(PPCond[i].first - TokenOff);
- uint32_t x = PPCond[i].second;
- assert(x != 0 && "PPCond entry not backpatched.");
- // Emit zero for #endifs. This allows us to do checking when
- // we read the PTH file back in.
- Emit32(x == i ? 0 : x);
- }
-
- return PTHEntry(TokenOff, PPCondOff);
-}
-
-Offset PTHWriter::EmitCachedSpellings() {
- // Write each cached strings to the PTH file.
- Offset SpellingsOff = Out.tell();
-
- for (std::vector<llvm::StringMapEntry<OffsetOpt>*>::iterator
- I = StrEntries.begin(), E = StrEntries.end(); I!=E; ++I)
- EmitBuf((*I)->getKeyData(), (*I)->getKeyLength()+1 /*nul included*/);
-
- return SpellingsOff;
-}
-
-static uint32_t swap32le(uint32_t X) {
- return llvm::support::endian::byte_swap<uint32_t, llvm::support::little>(X);
-}
-
-static void pwrite32le(raw_pwrite_stream &OS, uint32_t Val, uint64_t &Off) {
- uint32_t LEVal = swap32le(Val);
- OS.pwrite(reinterpret_cast<const char *>(&LEVal), 4, Off);
- Off += 4;
-}
-
-void PTHWriter::GeneratePTH(StringRef MainFile) {
- // Generate the prologue.
- Out << "cfe-pth" << '\0';
- Emit32(PTHManager::Version);
-
- // Leave 4 words for the prologue.
- Offset PrologueOffset = Out.tell();
- for (unsigned i = 0; i < 4; ++i)
- Emit32(0);
-
- // Write the name of the MainFile.
- if (!MainFile.empty()) {
- EmitString(MainFile);
- } else {
- // String with 0 bytes.
- Emit16(0);
- }
- Emit8(0);
-
- // Iterate over all the files in SourceManager. Create a lexer
- // for each file and cache the tokens.
- SourceManager &SM = PP.getSourceManager();
- const LangOptions &LOpts = PP.getLangOpts();
-
- for (SourceManager::fileinfo_iterator I = SM.fileinfo_begin(),
- E = SM.fileinfo_end(); I != E; ++I) {
- const SrcMgr::ContentCache &C = *I->second;
- const FileEntry *FE = C.OrigEntry;
-
- // FIXME: Handle files with non-absolute paths.
- if (llvm::sys::path::is_relative(FE->getName()))
- continue;
-
- const llvm::MemoryBuffer *B = C.getBuffer(PP.getDiagnostics(), SM);
- if (!B) continue;
-
- FileID FID = SM.createFileID(FE, SourceLocation(), SrcMgr::C_User);
- const llvm::MemoryBuffer *FromFile = SM.getBuffer(FID);
- Lexer L(FID, FromFile, SM, LOpts);
- PM.insert(FE, LexTokens(L));
- }
-
- // Write out the identifier table.
- const std::pair<Offset,Offset> &IdTableOff = EmitIdentifierTable();
-
- // Write out the cached strings table.
- Offset SpellingOff = EmitCachedSpellings();
-
- // Write out the file table.
- Offset FileTableOff = EmitFileTable();
-
- // Finally, write the prologue.
- uint64_t Off = PrologueOffset;
- pwrite32le(Out, IdTableOff.first, Off);
- pwrite32le(Out, IdTableOff.second, Off);
- pwrite32le(Out, FileTableOff, Off);
- pwrite32le(Out, SpellingOff, Off);
-}
-
-namespace {
-/// StatListener - A simple "interpose" object used to monitor stat calls
-/// invoked by FileManager while processing the original sources used
-/// as input to PTH generation. StatListener populates the PTHWriter's
-/// file map with stat information for directories as well as negative stats.
-/// Stat information for files are populated elsewhere.
-class StatListener : public FileSystemStatCache {
- PTHMap &PM;
-public:
- StatListener(PTHMap &pm) : PM(pm) {}
- ~StatListener() override {}
-
- LookupResult getStat(StringRef Path, FileData &Data, bool isFile,
- std::unique_ptr<llvm::vfs::File> *F,
- llvm::vfs::FileSystem &FS) override {
- LookupResult Result = statChained(Path, Data, isFile, F, FS);
-
- if (Result == CacheMissing) // Failed 'stat'.
- PM.insert(PTHEntryKeyVariant(Path), PTHEntry());
- else if (Data.IsDirectory) {
- // Only cache directories with absolute paths.
- if (llvm::sys::path::is_relative(Path))
- return Result;
-
- PM.insert(PTHEntryKeyVariant(&Data, Path), PTHEntry());
- }
-
- return Result;
- }
-};
-} // end anonymous namespace
-
-void clang::CacheTokens(Preprocessor &PP, raw_pwrite_stream *OS) {
- // Get the name of the main file.
- const SourceManager &SrcMgr = PP.getSourceManager();
- const FileEntry *MainFile = SrcMgr.getFileEntryForID(SrcMgr.getMainFileID());
- SmallString<128> MainFilePath(MainFile->getName());
-
- llvm::sys::fs::make_absolute(MainFilePath);
-
- // Create the PTHWriter.
- PTHWriter PW(*OS, PP);
-
- // Install the 'stat' system call listener in the FileManager.
- auto StatCacheOwner = llvm::make_unique<StatListener>(PW.getPM());
- StatListener *StatCache = StatCacheOwner.get();
- PP.getFileManager().addStatCache(std::move(StatCacheOwner),
- /*AtBeginning=*/true);
-
- // Lex through the entire file. This will populate SourceManager with
- // all of the header information.
- Token Tok;
- PP.EnterMainSourceFile();
- do { PP.Lex(Tok); } while (Tok.isNot(tok::eof));
-
- // Generate the PTH file.
- PP.getFileManager().removeStatCache(StatCache);
- PW.GeneratePTH(MainFilePath.str());
-}
-
-//===----------------------------------------------------------------------===//
-
-namespace {
-class PTHIdKey {
-public:
- const IdentifierInfo* II;
- uint32_t FileOffset;
-};
-
-class PTHIdentifierTableTrait {
-public:
- typedef PTHIdKey* key_type;
- typedef key_type key_type_ref;
-
- typedef uint32_t data_type;
- typedef data_type data_type_ref;
-
- typedef unsigned hash_value_type;
- typedef unsigned offset_type;
-
- static hash_value_type ComputeHash(PTHIdKey* key) {
- return llvm::djbHash(key->II->getName());
- }
-
- static std::pair<unsigned,unsigned>
- EmitKeyDataLength(raw_ostream& Out, const PTHIdKey* key, uint32_t) {
- using namespace llvm::support;
- unsigned n = key->II->getLength() + 1;
- endian::write<uint16_t>(Out, n, little);
- return std::make_pair(n, sizeof(uint32_t));
- }
-
- static void EmitKey(raw_ostream& Out, PTHIdKey* key, unsigned n) {
- // Record the location of the key data. This is used when generating
- // the mapping from persistent IDs to strings.
- key->FileOffset = Out.tell();
- Out.write(key->II->getNameStart(), n);
- }
-
- static void EmitData(raw_ostream& Out, PTHIdKey*, uint32_t pID,
- unsigned) {
- using namespace llvm::support;
- endian::write<uint32_t>(Out, pID, little);
- }
-};
-} // end anonymous namespace
-
-/// EmitIdentifierTable - Emits two tables to the PTH file. The first is
-/// a hashtable mapping from identifier strings to persistent IDs. The second
-/// is a straight table mapping from persistent IDs to string data (the
-/// keys of the first table).
-///
-std::pair<Offset,Offset> PTHWriter::EmitIdentifierTable() {
- // Build two maps:
- // (1) an inverse map from persistent IDs -> (IdentifierInfo*,Offset)
- // (2) a map from (IdentifierInfo*, Offset)* -> persistent IDs
-
- // Note that we use 'calloc', so all the bytes are 0.
- PTHIdKey *IIDMap = static_cast<PTHIdKey*>(
- llvm::safe_calloc(idcount, sizeof(PTHIdKey)));
-
- // Create the hashtable.
- llvm::OnDiskChainedHashTableGenerator<PTHIdentifierTableTrait> IIOffMap;
-
- // Generate mapping from persistent IDs -> IdentifierInfo*.
- for (IDMap::iterator I = IM.begin(), E = IM.end(); I != E; ++I) {
- // Decrement by 1 because we are using a vector for the lookup and
- // 0 is reserved for NULL.
- assert(I->second > 0);
- assert(I->second-1 < idcount);
- unsigned idx = I->second-1;
-
- // Store the mapping from persistent ID to IdentifierInfo*
- IIDMap[idx].II = I->first;
-
- // Store the reverse mapping in a hashtable.
- IIOffMap.insert(&IIDMap[idx], I->second);
- }
-
- // Write out the inverse map first. This causes the PCIDKey entries to
- // record PTH file offsets for the string data. This is used to write
- // the second table.
- Offset StringTableOffset = IIOffMap.Emit(Out);
-
- // Now emit the table mapping from persistent IDs to PTH file offsets.
- Offset IDOff = Out.tell();
- Emit32(idcount); // Emit the number of identifiers.
- for (unsigned i = 0 ; i < idcount; ++i)
- Emit32(IIDMap[i].FileOffset);
-
- // Finally, release the inverse map.
- free(IIDMap);
-
- return std::make_pair(IDOff, StringTableOffset);
-}
CInvok->getPreprocessorOpts().ChainedIncludes.clear();
CInvok->getPreprocessorOpts().ImplicitPCHInclude.clear();
- CInvok->getPreprocessorOpts().ImplicitPTHInclude.clear();
CInvok->getPreprocessorOpts().DisablePCHValidation = true;
CInvok->getPreprocessorOpts().Includes.clear();
CInvok->getPreprocessorOpts().MacroIncludes.clear();
#include "clang/Frontend/Utils.h"
#include "clang/Frontend/VerifyDiagnosticConsumer.h"
#include "clang/Lex/HeaderSearch.h"
-#include "clang/Lex/PTHManager.h"
#include "clang/Lex/Preprocessor.h"
#include "clang/Lex/PreprocessorOptions.h"
#include "clang/Sema/CodeCompleteConsumer.h"
// The module manager holds a reference to the old preprocessor (if any).
ModuleManager.reset();
- // Create a PTH manager if we are using some form of a token cache.
- PTHManager *PTHMgr = nullptr;
- if (!PPOpts.TokenCache.empty())
- PTHMgr = PTHManager::Create(PPOpts.TokenCache, getDiagnostics());
-
// Create the Preprocessor.
HeaderSearch *HeaderInfo =
new HeaderSearch(getHeaderSearchOptsPtr(), getSourceManager(),
getDiagnostics(), getLangOpts(), &getTarget());
PP = std::make_shared<Preprocessor>(
Invocation->getPreprocessorOptsPtr(), getDiagnostics(), getLangOpts(),
- getSourceManager(), getPCMCache(), *HeaderInfo, *this, PTHMgr,
+ getSourceManager(), getPCMCache(), *HeaderInfo, *this,
+ /*IdentifierInfoLookup=*/nullptr,
/*OwnsHeaderSearch=*/true, TUKind);
getTarget().adjust(getLangOpts());
PP->Initialize(getTarget(), getAuxTarget());
- // Note that this is different then passing PTHMgr to Preprocessor's ctor.
- // That argument is used as the IdentifierInfoLookup argument to
- // IdentifierTable's ctor.
- if (PTHMgr) {
- PTHMgr->setPreprocessor(&*PP);
- PP->setPTHManager(PTHMgr);
- }
-
if (PPOpts.DetailedRecord)
PP->createPreprocessingRecord();
Opts.ProgramAction = frontend::GenerateHeaderModule; break;
case OPT_emit_pch:
Opts.ProgramAction = frontend::GeneratePCH; break;
- case OPT_emit_pth:
- Opts.ProgramAction = frontend::GeneratePTH; break;
case OPT_init_only:
Opts.ProgramAction = frontend::InitOnly; break;
case OPT_fsyntax_only:
case frontend::GenerateModuleInterface:
case frontend::GenerateHeaderModule:
case frontend::GeneratePCH:
- case frontend::GeneratePTH:
case frontend::ParseSyntaxOnly:
case frontend::ModuleFileInfo:
case frontend::VerifyPCH:
DiagnosticsEngine &Diags,
frontend::ActionKind Action) {
Opts.ImplicitPCHInclude = Args.getLastArgValue(OPT_include_pch);
- Opts.ImplicitPTHInclude = Args.getLastArgValue(OPT_include_pth);
Opts.PCHWithHdrStop = Args.hasArg(OPT_pch_through_hdrstop_create) ||
Args.hasArg(OPT_pch_through_hdrstop_use);
Opts.PCHWithHdrStopCreate = Args.hasArg(OPT_pch_through_hdrstop_create);
Opts.PCHThroughHeader = Args.getLastArgValue(OPT_pch_through_header_EQ);
- if (const Arg *A = Args.getLastArg(OPT_token_cache))
- Opts.TokenCache = A->getValue();
- else
- Opts.TokenCache = Opts.ImplicitPTHInclude;
Opts.UsePredefines = !Args.hasArg(OPT_undef);
Opts.DetailedRecord = Args.hasArg(OPT_detailed_preprocessing_record);
Opts.DisablePCHValidation = Args.hasArg(OPT_fno_validate_pch);
} while (Tok.isNot(tok::eof));
}
-void GeneratePTHAction::ExecuteAction() {
- CompilerInstance &CI = getCompilerInstance();
- std::unique_ptr<raw_pwrite_stream> OS =
- CI.createDefaultOutputFile(true, getCurrentFile());
- if (!OS)
- return;
-
- CacheTokens(CI.getPreprocessor(), OS.get());
-}
-
void PreprocessOnlyAction::ExecuteAction() {
Preprocessor &PP = getCompilerInstance().getPreprocessor();
#include "clang/Frontend/FrontendOptions.h"
#include "clang/Frontend/Utils.h"
#include "clang/Lex/HeaderSearch.h"
-#include "clang/Lex/PTHManager.h"
#include "clang/Lex/Preprocessor.h"
#include "clang/Lex/PreprocessorOptions.h"
#include "clang/Serialization/ASTReader.h"
Builder.append("##"); // ##?
}
-/// AddImplicitIncludePTH - Add an implicit \#include using the original file
-/// used to generate a PTH cache.
-static void AddImplicitIncludePTH(MacroBuilder &Builder, Preprocessor &PP,
- StringRef ImplicitIncludePTH) {
- PTHManager *P = PP.getPTHManager();
- // Null check 'P' in the corner case where it couldn't be created.
- const char *OriginalFile = P ? P->getOriginalSourceFile() : nullptr;
-
- if (!OriginalFile) {
- PP.getDiagnostics().Report(diag::err_fe_pth_file_has_no_source_header)
- << ImplicitIncludePTH;
- return;
- }
-
- AddImplicitInclude(Builder, OriginalFile);
-}
-
/// Add an implicit \#include using the original file used to generate
/// a PCH file.
static void AddImplicitIncludePCH(MacroBuilder &Builder, Preprocessor &PP,
if (!InitOpts.ImplicitPCHInclude.empty())
AddImplicitIncludePCH(Builder, PP, PCHContainerRdr,
InitOpts.ImplicitPCHInclude);
- if (!InitOpts.ImplicitPTHInclude.empty())
- AddImplicitIncludePTH(Builder, PP, InitOpts.ImplicitPTHInclude);
// Process -include directives.
for (unsigned i = 0, e = InitOpts.Includes.size(); i != e; ++i) {
case GenerateHeaderModule:
return llvm::make_unique<GenerateHeaderModuleAction>();
case GeneratePCH: return llvm::make_unique<GeneratePCHAction>();
- case GeneratePTH: return llvm::make_unique<GeneratePTHAction>();
case InitOnly: return llvm::make_unique<InitOnlyAction>();
case ParseSyntaxOnly: return llvm::make_unique<SyntaxOnlyAction>();
case ModuleFileInfo: return llvm::make_unique<DumpModuleInfoAction>();
PPExpressions.cpp
PPLexerChange.cpp
PPMacroExpansion.cpp
- PTHLexer.cpp
Pragma.cpp
PreprocessingRecord.cpp
Preprocessor.cpp
#include "clang/Lex/Pragma.h"
#include "clang/Lex/Preprocessor.h"
#include "clang/Lex/PreprocessorOptions.h"
-#include "clang/Lex/PTHLexer.h"
#include "clang/Lex/Token.h"
#include "clang/Lex/VariadicMacroSupport.h"
#include "llvm/ADT/ArrayRef.h"
CurPPLexer->pushConditionalLevel(IfTokenLoc, /*isSkipping*/ false,
FoundNonSkipPortion, FoundElse);
- if (CurPTHLexer) {
- PTHSkipExcludedConditionalBlock();
- return;
- }
-
// Enter raw mode to disable identifier lookup (and thus macro expansion),
// disabling warnings, etc.
CurPPLexer->LexingRawMode = true;
Tok.getLocation());
}
-void Preprocessor::PTHSkipExcludedConditionalBlock() {
- while (true) {
- assert(CurPTHLexer);
- assert(CurPTHLexer->LexingRawMode == false);
-
- // Skip to the next '#else', '#elif', or #endif.
- if (CurPTHLexer->SkipBlock()) {
- // We have reached an #endif. Both the '#' and 'endif' tokens
- // have been consumed by the PTHLexer. Just pop off the condition level.
- PPConditionalInfo CondInfo;
- bool InCond = CurPTHLexer->popConditionalLevel(CondInfo);
- (void)InCond; // Silence warning in no-asserts mode.
- assert(!InCond && "Can't be skipping if not in a conditional!");
- break;
- }
-
- // We have reached a '#else' or '#elif'. Lex the next token to get
- // the directive flavor.
- Token Tok;
- LexUnexpandedToken(Tok);
-
- // We can actually look up the IdentifierInfo here since we aren't in
- // raw mode.
- tok::PPKeywordKind K = Tok.getIdentifierInfo()->getPPKeywordID();
-
- if (K == tok::pp_else) {
- // #else: Enter the else condition. We aren't in a nested condition
- // since we skip those. We're always in the one matching the last
- // block we skipped.
- PPConditionalInfo &CondInfo = CurPTHLexer->peekConditionalLevel();
- // Note that we've seen a #else in this conditional.
- CondInfo.FoundElse = true;
-
- // If the #if block wasn't entered then enter the #else block now.
- if (!CondInfo.FoundNonSkip) {
- CondInfo.FoundNonSkip = true;
-
- // Scan until the eod token.
- CurPTHLexer->ParsingPreprocessorDirective = true;
- DiscardUntilEndOfDirective();
- CurPTHLexer->ParsingPreprocessorDirective = false;
-
- break;
- }
-
- // Otherwise skip this block.
- continue;
- }
-
- assert(K == tok::pp_elif);
- PPConditionalInfo &CondInfo = CurPTHLexer->peekConditionalLevel();
-
- // If this is a #elif with a #else before it, report the error.
- if (CondInfo.FoundElse)
- Diag(Tok, diag::pp_err_elif_after_else);
-
- // If this is in a skipping block or if we've already handled this #if
- // block, don't bother parsing the condition. We just skip this block.
- if (CondInfo.FoundNonSkip)
- continue;
-
- // Evaluate the condition of the #elif.
- IdentifierInfo *IfNDefMacro = nullptr;
- CurPTHLexer->ParsingPreprocessorDirective = true;
- bool ShouldEnter = EvaluateDirectiveExpression(IfNDefMacro).Conditional;
- CurPTHLexer->ParsingPreprocessorDirective = false;
-
- // If this condition is true, enter it!
- if (ShouldEnter) {
- CondInfo.FoundNonSkip = true;
- break;
- }
-
- // Otherwise, skip this block and go to the next one.
- }
-}
-
Module *Preprocessor::getModuleForLocation(SourceLocation Loc) {
if (!SourceMgr.isInMainFile(Loc)) {
// Try to determine the module of the include directive.
///
void Preprocessor::HandleUserDiagnosticDirective(Token &Tok,
bool isWarning) {
- // PTH doesn't emit #warning or #error directives.
- if (CurPTHLexer)
- return CurPTHLexer->DiscardToEndOfLine();
-
// Read the rest of the line raw. We do this because we don't want macros
// to be expanded and we don't require that the tokens be valid preprocessing
// tokens. For example, this is allowed: "#warning ` 'foo". GCC does
if (hadModuleLoaderFatalFailure()) {
// With a fatal failure in the module loader, we abort parsing.
Token &Result = IncludeTok;
- if (CurLexer) {
- Result.startToken();
- CurLexer->FormTokenWithChars(Result, CurLexer->BufferEnd, tok::eof);
- CurLexer->cutOffLexing();
- } else {
- assert(CurPTHLexer && "#include but no current lexer set!");
- CurPTHLexer->getEOF(Result);
- }
+ assert(CurLexer && "#include but no current lexer set!");
+ Result.startToken();
+ CurLexer->FormTokenWithChars(Result, CurLexer->BufferEnd, tok::eof);
+ CurLexer->cutOffLexing();
}
return;
}
#include "clang/Lex/HeaderSearch.h"
#include "clang/Lex/LexDiagnostic.h"
#include "clang/Lex/MacroInfo.h"
-#include "clang/Lex/PTHManager.h"
#include "llvm/ADT/StringSwitch.h"
#include "llvm/Support/FileSystem.h"
#include "llvm/Support/MemoryBuffer.h"
if (MaxIncludeStackDepth < IncludeMacroStack.size())
MaxIncludeStackDepth = IncludeMacroStack.size();
- if (PTH) {
- if (PTHLexer *PL = PTH->CreateLexer(FID)) {
- EnterSourceFileWithPTH(PL, CurDir);
- return false;
- }
- }
-
// Get the MemoryBuffer for this FID, if it fails, we fail.
bool Invalid = false;
const llvm::MemoryBuffer *InputFile =
}
}
-/// EnterSourceFileWithPTH - Add a source file to the top of the include stack
-/// and start getting tokens from it using the PTH cache.
-void Preprocessor::EnterSourceFileWithPTH(PTHLexer *PL,
- const DirectoryLookup *CurDir) {
-
- if (CurPPLexer || CurTokenLexer)
- PushIncludeMacroStack();
-
- CurDirLookup = CurDir;
- CurPTHLexer.reset(PL);
- CurPPLexer = CurPTHLexer.get();
- CurLexerSubmodule = nullptr;
- if (CurLexerKind != CLK_LexAfterModuleImport)
- CurLexerKind = CLK_PTHLexer;
-
- // Notify the client, if desired, that we are in a new source file.
- if (Callbacks) {
- FileID FID = CurPPLexer->getFileID();
- SourceLocation EnterLoc = SourceMgr.getLocForStartOfFile(FID);
- SrcMgr::CharacteristicKind FileType =
- SourceMgr.getFileCharacteristic(EnterLoc);
- Callbacks->FileChanged(EnterLoc, PPCallbacks::EnterFile, FileType);
- }
-}
-
/// EnterMacro - Add a Macro to the top of the include stack and start lexing
/// tokens from it instead of the current buffer.
void Preprocessor::EnterMacro(Token &Tok, SourceLocation ILEnd,
// If we have an unclosed module region from a pragma at the end of a
// module, complain and close it now.
- // FIXME: This is not correct if we are building a module from PTH.
const bool LeavingSubmodule = CurLexer && CurLexerSubmodule;
if ((LeavingSubmodule || IncludeMacroStack.empty()) &&
!BuildingSubmoduleStack.empty() &&
if (isCodeCompletionEnabled() && CurPPLexer &&
SourceMgr.getLocForStartOfFile(CurPPLexer->getFileID()) ==
CodeCompletionFileLoc) {
- if (CurLexer) {
- Result.startToken();
- CurLexer->FormTokenWithChars(Result, CurLexer->BufferEnd, tok::eof);
- CurLexer.reset();
- } else {
- assert(CurPTHLexer && "Got EOF but no current lexer set!");
- CurPTHLexer->getEOF(Result);
- CurPTHLexer.reset();
- }
+ assert(CurLexer && "Got EOF but no current lexer set!");
+ Result.startToken();
+ CurLexer->FormTokenWithChars(Result, CurLexer->BufferEnd, tok::eof);
+ CurLexer.reset();
CurPPLexer = nullptr;
recomputeCurLexerKind();
}
// If this is the end of the main file, form an EOF token.
- if (CurLexer) {
- const char *EndPos = getCurLexerEndPos();
- Result.startToken();
- CurLexer->BufferPtr = EndPos;
- CurLexer->FormTokenWithChars(Result, EndPos, tok::eof);
-
- if (isCodeCompletionEnabled()) {
- // Inserting the code-completion point increases the source buffer by 1,
- // but the main FileID was created before inserting the point.
- // Compensate by reducing the EOF location by 1, otherwise the location
- // will point to the next FileID.
- // FIXME: This is hacky, the code-completion point should probably be
- // inserted before the main FileID is created.
- if (CurLexer->getFileLoc() == CodeCompletionFileLoc)
- Result.setLocation(Result.getLocation().getLocWithOffset(-1));
- }
-
- if (creatingPCHWithThroughHeader() && !LeavingPCHThroughHeader) {
- // Reached the end of the compilation without finding the through header.
- Diag(CurLexer->getFileLoc(), diag::err_pp_through_header_not_seen)
- << PPOpts->PCHThroughHeader << 0;
- }
+ assert(CurLexer && "Got EOF but no current lexer set!");
+ const char *EndPos = getCurLexerEndPos();
+ Result.startToken();
+ CurLexer->BufferPtr = EndPos;
+ CurLexer->FormTokenWithChars(Result, EndPos, tok::eof);
+
+ if (isCodeCompletionEnabled()) {
+ // Inserting the code-completion point increases the source buffer by 1,
+ // but the main FileID was created before inserting the point.
+ // Compensate by reducing the EOF location by 1, otherwise the location
+ // will point to the next FileID.
+ // FIXME: This is hacky, the code-completion point should probably be
+ // inserted before the main FileID is created.
+ if (CurLexer->getFileLoc() == CodeCompletionFileLoc)
+ Result.setLocation(Result.getLocation().getLocWithOffset(-1));
+ }
- if (!isIncrementalProcessingEnabled())
- // We're done with lexing.
- CurLexer.reset();
- } else {
- assert(CurPTHLexer && "Got EOF but no current lexer set!");
- CurPTHLexer->getEOF(Result);
- CurPTHLexer.reset();
+ if (creatingPCHWithThroughHeader() && !LeavingPCHThroughHeader) {
+ // Reached the end of the compilation without finding the through header.
+ Diag(CurLexer->getFileLoc(), diag::err_pp_through_header_not_seen)
+ << PPOpts->PCHThroughHeader << 0;
}
+ if (!isIncrementalProcessingEnabled())
+ // We're done with lexing.
+ CurLexer.reset();
+
if (!isIncrementalProcessingEnabled())
CurPPLexer = nullptr;
#include "clang/Lex/MacroInfo.h"
#include "clang/Lex/Preprocessor.h"
#include "clang/Lex/PreprocessorLexer.h"
-#include "clang/Lex/PTHLexer.h"
#include "clang/Lex/Token.h"
#include "llvm/ADT/ArrayRef.h"
#include "llvm/ADT/DenseMap.h"
unsigned Val;
if (CurLexer)
Val = CurLexer->isNextPPTokenLParen();
- else if (CurPTHLexer)
- Val = CurPTHLexer->isNextPPTokenLParen();
else
Val = CurTokenLexer->isNextTokenLParen();
for (const IncludeStackInfo &Entry : llvm::reverse(IncludeMacroStack)) {
if (Entry.TheLexer)
Val = Entry.TheLexer->isNextPPTokenLParen();
- else if (Entry.ThePTHLexer)
- Val = Entry.ThePTHLexer->isNextPPTokenLParen();
else
Val = Entry.TheTokenLexer->isNextTokenLParen();
+++ /dev/null
-//===- PTHLexer.cpp - Lex from a token stream -----------------------------===//
-//
-// The LLVM Compiler Infrastructure
-//
-// This file is distributed under the University of Illinois Open Source
-// License. See LICENSE.TXT for details.
-//
-//===----------------------------------------------------------------------===//
-//
-// This file implements the PTHLexer interface.
-//
-//===----------------------------------------------------------------------===//
-
-#include "clang/Lex/PTHLexer.h"
-#include "clang/Basic/Diagnostic.h"
-#include "clang/Basic/FileManager.h"
-#include "clang/Basic/FileSystemStatCache.h"
-#include "clang/Basic/IdentifierTable.h"
-#include "clang/Basic/SourceManager.h"
-#include "clang/Basic/TokenKinds.h"
-#include "clang/Lex/LexDiagnostic.h"
-#include "clang/Lex/PTHManager.h"
-#include "clang/Lex/Preprocessor.h"
-#include "clang/Lex/Token.h"
-#include "llvm/ADT/STLExtras.h"
-#include "llvm/ADT/StringRef.h"
-#include "llvm/Support/DJB.h"
-#include "llvm/Support/Endian.h"
-#include "llvm/Support/ErrorOr.h"
-#include "llvm/Support/FileSystem.h"
-#include "llvm/Support/MemoryBuffer.h"
-#include "llvm/Support/OnDiskHashTable.h"
-#include <cassert>
-#include <cstdint>
-#include <cstdlib>
-#include <cstring>
-#include <ctime>
-#include <memory>
-#include <utility>
-
-using namespace clang;
-
-static const unsigned StoredTokenSize = 1 + 1 + 2 + 4 + 4;
-
-//===----------------------------------------------------------------------===//
-// PTHLexer methods.
-//===----------------------------------------------------------------------===//
-
-PTHLexer::PTHLexer(Preprocessor &PP, FileID FID, const unsigned char *D,
- const unsigned char *ppcond, PTHManager &PM)
- : PreprocessorLexer(&PP, FID), TokBuf(D), CurPtr(D), PPCond(ppcond),
- CurPPCondPtr(ppcond), PTHMgr(PM) {
- FileStartLoc = PP.getSourceManager().getLocForStartOfFile(FID);
-}
-
-bool PTHLexer::Lex(Token& Tok) {
- //===--------------------------------------==//
- // Read the raw token data.
- //===--------------------------------------==//
- using namespace llvm::support;
-
- // Shadow CurPtr into an automatic variable.
- const unsigned char *CurPtrShadow = CurPtr;
-
- // Read in the data for the token.
- unsigned Word0 = endian::readNext<uint32_t, little, aligned>(CurPtrShadow);
- uint32_t IdentifierID =
- endian::readNext<uint32_t, little, aligned>(CurPtrShadow);
- uint32_t FileOffset =
- endian::readNext<uint32_t, little, aligned>(CurPtrShadow);
-
- tok::TokenKind TKind = (tok::TokenKind) (Word0 & 0xFF);
- Token::TokenFlags TFlags = (Token::TokenFlags) ((Word0 >> 8) & 0xFF);
- uint32_t Len = Word0 >> 16;
-
- CurPtr = CurPtrShadow;
-
- //===--------------------------------------==//
- // Construct the token itself.
- //===--------------------------------------==//
-
- Tok.startToken();
- Tok.setKind(TKind);
- Tok.setFlag(TFlags);
- assert(!LexingRawMode);
- Tok.setLocation(FileStartLoc.getLocWithOffset(FileOffset));
- Tok.setLength(Len);
-
- // Handle identifiers.
- if (Tok.isLiteral()) {
- Tok.setLiteralData((const char*) (PTHMgr.SpellingBase + IdentifierID));
- }
- else if (IdentifierID) {
- MIOpt.ReadToken();
- IdentifierInfo *II = PTHMgr.GetIdentifierInfo(IdentifierID-1);
-
- Tok.setIdentifierInfo(II);
-
- // Change the kind of this identifier to the appropriate token kind, e.g.
- // turning "for" into a keyword.
- Tok.setKind(II->getTokenID());
-
- if (II->isHandleIdentifierCase())
- return PP->HandleIdentifier(Tok);
-
- return true;
- }
-
- //===--------------------------------------==//
- // Process the token.
- //===--------------------------------------==//
- if (TKind == tok::eof) {
- // Save the end-of-file token.
- EofToken = Tok;
-
- assert(!ParsingPreprocessorDirective);
- assert(!LexingRawMode);
-
- return LexEndOfFile(Tok);
- }
-
- if (TKind == tok::hash && Tok.isAtStartOfLine()) {
- LastHashTokPtr = CurPtr - StoredTokenSize;
- assert(!LexingRawMode);
- PP->HandleDirective(Tok);
-
- return false;
- }
-
- if (TKind == tok::eod) {
- assert(ParsingPreprocessorDirective);
- ParsingPreprocessorDirective = false;
- return true;
- }
-
- MIOpt.ReadToken();
- return true;
-}
-
-bool PTHLexer::LexEndOfFile(Token &Result) {
- // If we hit the end of the file while parsing a preprocessor directive,
- // end the preprocessor directive first. The next token returned will
- // then be the end of file.
- if (ParsingPreprocessorDirective) {
- ParsingPreprocessorDirective = false; // Done parsing the "line".
- return true; // Have a token.
- }
-
- assert(!LexingRawMode);
-
- // If we are in a #if directive, emit an error.
- while (!ConditionalStack.empty()) {
- if (PP->getCodeCompletionFileLoc() != FileStartLoc)
- PP->Diag(ConditionalStack.back().IfLoc,
- diag::err_pp_unterminated_conditional);
- ConditionalStack.pop_back();
- }
-
- // Finally, let the preprocessor handle this.
- return PP->HandleEndOfFile(Result);
-}
-
-// FIXME: We can just grab the last token instead of storing a copy
-// into EofToken.
-void PTHLexer::getEOF(Token& Tok) {
- assert(EofToken.is(tok::eof));
- Tok = EofToken;
-}
-
-void PTHLexer::DiscardToEndOfLine() {
- assert(ParsingPreprocessorDirective && ParsingFilename == false &&
- "Must be in a preprocessing directive!");
-
- // We assume that if the preprocessor wishes to discard to the end of
- // the line, it also means to end the current preprocessor directive.
- ParsingPreprocessorDirective = false;
-
- // Skip tokens by only peeking at their token kind and the flags.
- // We don't need to actually reconstruct full tokens from the token buffer.
- // This saves some copies and it also reduces IdentifierInfo* lookup.
- const unsigned char* p = CurPtr;
- while (true) {
- // Read the token kind. Are we at the end of the file?
- tok::TokenKind x = (tok::TokenKind) (uint8_t) *p;
- if (x == tok::eof) break;
-
- // Read the token flags. Are we at the start of the next line?
- Token::TokenFlags y = (Token::TokenFlags) (uint8_t) p[1];
- if (y & Token::StartOfLine) break;
-
- // Skip to the next token.
- p += StoredTokenSize;
- }
-
- CurPtr = p;
-}
-
-/// SkipBlock - Used by Preprocessor to skip the current conditional block.
-bool PTHLexer::SkipBlock() {
- using namespace llvm::support;
-
- assert(CurPPCondPtr && "No cached PP conditional information.");
- assert(LastHashTokPtr && "No known '#' token.");
-
- const unsigned char *HashEntryI = nullptr;
- uint32_t TableIdx;
-
- do {
- // Read the token offset from the side-table.
- uint32_t Offset = endian::readNext<uint32_t, little, aligned>(CurPPCondPtr);
-
- // Read the target table index from the side-table.
- TableIdx = endian::readNext<uint32_t, little, aligned>(CurPPCondPtr);
-
- // Compute the actual memory address of the '#' token data for this entry.
- HashEntryI = TokBuf + Offset;
-
- // Optimization: "Sibling jumping". #if...#else...#endif blocks can
- // contain nested blocks. In the side-table we can jump over these
- // nested blocks instead of doing a linear search if the next "sibling"
- // entry is not at a location greater than LastHashTokPtr.
- if (HashEntryI < LastHashTokPtr && TableIdx) {
- // In the side-table we are still at an entry for a '#' token that
- // is earlier than the last one we saw. Check if the location we would
- // stride gets us closer.
- const unsigned char* NextPPCondPtr =
- PPCond + TableIdx*(sizeof(uint32_t)*2);
- assert(NextPPCondPtr >= CurPPCondPtr);
- // Read where we should jump to.
- const unsigned char *HashEntryJ =
- TokBuf + endian::readNext<uint32_t, little, aligned>(NextPPCondPtr);
-
- if (HashEntryJ <= LastHashTokPtr) {
- // Jump directly to the next entry in the side table.
- HashEntryI = HashEntryJ;
- TableIdx = endian::readNext<uint32_t, little, aligned>(NextPPCondPtr);
- CurPPCondPtr = NextPPCondPtr;
- }
- }
- }
- while (HashEntryI < LastHashTokPtr);
- assert(HashEntryI == LastHashTokPtr && "No PP-cond entry found for '#'");
- assert(TableIdx && "No jumping from #endifs.");
-
- // Update our side-table iterator.
- const unsigned char* NextPPCondPtr = PPCond + TableIdx*(sizeof(uint32_t)*2);
- assert(NextPPCondPtr >= CurPPCondPtr);
- CurPPCondPtr = NextPPCondPtr;
-
- // Read where we should jump to.
- HashEntryI =
- TokBuf + endian::readNext<uint32_t, little, aligned>(NextPPCondPtr);
- uint32_t NextIdx = endian::readNext<uint32_t, little, aligned>(NextPPCondPtr);
-
- // By construction NextIdx will be zero if this is a #endif, which is useful
- // to know because it lets us avoid lexing another token.
- bool isEndif = NextIdx == 0;
-
- // This case can occur when we see something like this:
- //
- // #if ...
- // /* a comment or nothing */
- // #elif
- //
- // If we are skipping the first #if block, it will be the case that CurPtr
- // already points to 'elif'. Just return.
-
- if (CurPtr > HashEntryI) {
- assert(CurPtr == HashEntryI + StoredTokenSize);
- // Did we reach a #endif? If so, go ahead and consume that token as well.
- if (isEndif)
- CurPtr += StoredTokenSize * 2;
- else
- LastHashTokPtr = HashEntryI;
-
- return isEndif;
- }
-
- // Otherwise, we need to advance. Update CurPtr to point to the '#' token.
- CurPtr = HashEntryI;
-
- // Update the location of the last observed '#'. This is useful if we
- // are skipping multiple blocks.
- LastHashTokPtr = CurPtr;
-
- // Skip the '#' token.
- assert(((tok::TokenKind)*CurPtr) == tok::hash);
- CurPtr += StoredTokenSize;
-
- // Did we reach a #endif? If so, go ahead and consume that token as well.
- if (isEndif) {
- CurPtr += StoredTokenSize * 2;
- }
-
- return isEndif;
-}
-
-SourceLocation PTHLexer::getSourceLocation() {
- // getSourceLocation is not on the hot path. It is used to get the location
- // of the next token when transitioning back to this lexer when done
- // handling a #included file. Just read the necessary data from the token
- // data buffer to construct the SourceLocation object.
- // NOTE: This is a virtual function; hence it is defined out-of-line.
- using namespace llvm::support;
-
- const unsigned char *OffsetPtr = CurPtr + (StoredTokenSize - 4);
- uint32_t Offset = endian::readNext<uint32_t, little, aligned>(OffsetPtr);
- return FileStartLoc.getLocWithOffset(Offset);
-}
-
-//===----------------------------------------------------------------------===//
-// PTH file lookup: map from strings to file data.
-//===----------------------------------------------------------------------===//
-
-/// PTHFileLookup - This internal data structure is used by the PTHManager
-/// to map from FileEntry objects managed by FileManager to offsets within
-/// the PTH file.
-namespace {
-
-class PTHFileData {
- const uint32_t TokenOff;
- const uint32_t PPCondOff;
-
-public:
- PTHFileData(uint32_t tokenOff, uint32_t ppCondOff)
- : TokenOff(tokenOff), PPCondOff(ppCondOff) {}
-
- uint32_t getTokenOffset() const { return TokenOff; }
- uint32_t getPPCondOffset() const { return PPCondOff; }
-};
-
-class PTHFileLookupCommonTrait {
-public:
- using internal_key_type = std::pair<unsigned char, StringRef>;
- using hash_value_type = unsigned;
- using offset_type = unsigned;
-
- static hash_value_type ComputeHash(internal_key_type x) {
- return llvm::djbHash(x.second);
- }
-
- static std::pair<unsigned, unsigned>
- ReadKeyDataLength(const unsigned char*& d) {
- using namespace llvm::support;
-
- unsigned keyLen =
- (unsigned)endian::readNext<uint16_t, little, unaligned>(d);
- unsigned dataLen = (unsigned) *(d++);
- return std::make_pair(keyLen, dataLen);
- }
-
- static internal_key_type ReadKey(const unsigned char* d, unsigned) {
- unsigned char k = *(d++); // Read the entry kind.
- return std::make_pair(k, (const char*) d);
- }
-};
-
-} // namespace
-
-class PTHManager::PTHFileLookupTrait : public PTHFileLookupCommonTrait {
-public:
- using external_key_type = const FileEntry *;
- using data_type = PTHFileData;
-
- static internal_key_type GetInternalKey(const FileEntry* FE) {
- return std::make_pair((unsigned char) 0x1, FE->getName());
- }
-
- static bool EqualKey(internal_key_type a, internal_key_type b) {
- return a.first == b.first && a.second == b.second;
- }
-
- static PTHFileData ReadData(const internal_key_type& k,
- const unsigned char* d, unsigned) {
- using namespace llvm::support;
-
- assert(k.first == 0x1 && "Only file lookups can match!");
- uint32_t x = endian::readNext<uint32_t, little, unaligned>(d);
- uint32_t y = endian::readNext<uint32_t, little, unaligned>(d);
- return PTHFileData(x, y);
- }
-};
-
-class PTHManager::PTHStringLookupTrait {
-public:
- using data_type = uint32_t;
- using external_key_type = const std::pair<const char *, unsigned>;
- using internal_key_type = external_key_type;
- using hash_value_type = uint32_t;
- using offset_type = unsigned;
-
- static bool EqualKey(const internal_key_type& a,
- const internal_key_type& b) {
- return (a.second == b.second) ? memcmp(a.first, b.first, a.second) == 0
- : false;
- }
-
- static hash_value_type ComputeHash(const internal_key_type& a) {
- return llvm::djbHash(StringRef(a.first, a.second));
- }
-
- // This hopefully will just get inlined and removed by the optimizer.
- static const internal_key_type&
- GetInternalKey(const external_key_type& x) { return x; }
-
- static std::pair<unsigned, unsigned>
- ReadKeyDataLength(const unsigned char*& d) {
- using namespace llvm::support;
-
- return std::make_pair(
- (unsigned)endian::readNext<uint16_t, little, unaligned>(d),
- sizeof(uint32_t));
- }
-
- static std::pair<const char*, unsigned>
- ReadKey(const unsigned char* d, unsigned n) {
- assert(n >= 2 && d[n-1] == '\0');
- return std::make_pair((const char*) d, n-1);
- }
-
- static uint32_t ReadData(const internal_key_type& k, const unsigned char* d,
- unsigned) {
- using namespace llvm::support;
-
- return endian::readNext<uint32_t, little, unaligned>(d);
- }
-};
-
-//===----------------------------------------------------------------------===//
-// PTHManager methods.
-//===----------------------------------------------------------------------===//
-
-PTHManager::PTHManager(
- std::unique_ptr<const llvm::MemoryBuffer> buf,
- std::unique_ptr<PTHFileLookup> fileLookup, const unsigned char *idDataTable,
- std::unique_ptr<IdentifierInfo *[], llvm::FreeDeleter> perIDCache,
- std::unique_ptr<PTHStringIdLookup> stringIdLookup, unsigned numIds,
- const unsigned char *spellingBase, const char *originalSourceFile)
- : Buf(std::move(buf)), PerIDCache(std::move(perIDCache)),
- FileLookup(std::move(fileLookup)), IdDataTable(idDataTable),
- StringIdLookup(std::move(stringIdLookup)), NumIds(numIds),
- SpellingBase(spellingBase), OriginalSourceFile(originalSourceFile) {}
-
-PTHManager::~PTHManager() = default;
-
-static void InvalidPTH(DiagnosticsEngine &Diags, const char *Msg) {
- Diags.Report(Diags.getCustomDiagID(DiagnosticsEngine::Error, "%0")) << Msg;
-}
-
-PTHManager *PTHManager::Create(StringRef file, DiagnosticsEngine &Diags) {
- // Memory map the PTH file.
- llvm::ErrorOr<std::unique_ptr<llvm::MemoryBuffer>> FileOrErr =
- llvm::MemoryBuffer::getFile(file);
-
- if (!FileOrErr) {
- // FIXME: Add ec.message() to this diag.
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr;
- }
- std::unique_ptr<llvm::MemoryBuffer> File = std::move(FileOrErr.get());
-
- using namespace llvm::support;
-
- // Get the buffer ranges and check if there are at least three 32-bit
- // words at the end of the file.
- const unsigned char *BufBeg = (const unsigned char*)File->getBufferStart();
- const unsigned char *BufEnd = (const unsigned char*)File->getBufferEnd();
-
- // Check the prologue of the file.
- if ((BufEnd - BufBeg) < (signed)(sizeof("cfe-pth") + 4 + 4) ||
- memcmp(BufBeg, "cfe-pth", sizeof("cfe-pth")) != 0) {
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr;
- }
-
- // Read the PTH version.
- const unsigned char *p = BufBeg + (sizeof("cfe-pth"));
- unsigned Version = endian::readNext<uint32_t, little, aligned>(p);
-
- if (Version < PTHManager::Version) {
- InvalidPTH(Diags,
- Version < PTHManager::Version
- ? "PTH file uses an older PTH format that is no longer supported"
- : "PTH file uses a newer PTH format that cannot be read");
- return nullptr;
- }
-
- // Compute the address of the index table at the end of the PTH file.
- const unsigned char *PrologueOffset = p;
-
- if (PrologueOffset >= BufEnd) {
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr;
- }
-
- // Construct the file lookup table. This will be used for mapping from
- // FileEntry*'s to cached tokens.
- const unsigned char* FileTableOffset = PrologueOffset + sizeof(uint32_t)*2;
- const unsigned char *FileTable =
- BufBeg + endian::readNext<uint32_t, little, aligned>(FileTableOffset);
-
- if (!(FileTable > BufBeg && FileTable < BufEnd)) {
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr; // FIXME: Proper error diagnostic?
- }
-
- std::unique_ptr<PTHFileLookup> FL(PTHFileLookup::Create(FileTable, BufBeg));
-
- // Warn if the PTH file is empty. We still want to create a PTHManager
- // as the PTH could be used with -include-pth.
- if (FL->isEmpty())
- InvalidPTH(Diags, "PTH file contains no cached source data");
-
- // Get the location of the table mapping from persistent ids to the
- // data needed to reconstruct identifiers.
- const unsigned char* IDTableOffset = PrologueOffset + sizeof(uint32_t)*0;
- const unsigned char *IData =
- BufBeg + endian::readNext<uint32_t, little, aligned>(IDTableOffset);
-
- if (!(IData >= BufBeg && IData < BufEnd)) {
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr;
- }
-
- // Get the location of the hashtable mapping between strings and
- // persistent IDs.
- const unsigned char* StringIdTableOffset = PrologueOffset + sizeof(uint32_t)*1;
- const unsigned char *StringIdTable =
- BufBeg + endian::readNext<uint32_t, little, aligned>(StringIdTableOffset);
- if (!(StringIdTable >= BufBeg && StringIdTable < BufEnd)) {
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr;
- }
-
- std::unique_ptr<PTHStringIdLookup> SL(
- PTHStringIdLookup::Create(StringIdTable, BufBeg));
-
- // Get the location of the spelling cache.
- const unsigned char* spellingBaseOffset = PrologueOffset + sizeof(uint32_t)*3;
- const unsigned char *spellingBase =
- BufBeg + endian::readNext<uint32_t, little, aligned>(spellingBaseOffset);
- if (!(spellingBase >= BufBeg && spellingBase < BufEnd)) {
- Diags.Report(diag::err_invalid_pth_file) << file;
- return nullptr;
- }
-
- // Get the number of IdentifierInfos and pre-allocate the identifier cache.
- uint32_t NumIds = endian::readNext<uint32_t, little, aligned>(IData);
-
- // Pre-allocate the persistent ID -> IdentifierInfo* cache. We use calloc()
- // so that, in the best case, we only zero out memory once when the OS
- // returns us new pages.
- std::unique_ptr<IdentifierInfo *[], llvm::FreeDeleter> PerIDCache;
-
- if (NumIds) {
- PerIDCache.reset((IdentifierInfo **)calloc(NumIds, sizeof(PerIDCache[0])));
- if (!PerIDCache) {
- InvalidPTH(Diags, "Could not allocate memory for processing PTH file");
- return nullptr;
- }
- }
-
- // Compute the address of the original source file.
- const unsigned char* originalSourceBase = PrologueOffset + sizeof(uint32_t)*4;
- unsigned len =
- endian::readNext<uint16_t, little, unaligned>(originalSourceBase);
- if (!len) originalSourceBase = nullptr;
-
- // Create the new PTHManager.
- return new PTHManager(std::move(File), std::move(FL), IData,
- std::move(PerIDCache), std::move(SL), NumIds,
- spellingBase, (const char *)originalSourceBase);
-}
-
-IdentifierInfo* PTHManager::LazilyCreateIdentifierInfo(unsigned PersistentID) {
- using namespace llvm::support;
-
- // Look in the PTH file for the string data for the IdentifierInfo object.
- const unsigned char* TableEntry = IdDataTable + sizeof(uint32_t)*PersistentID;
- const unsigned char *IDData =
- (const unsigned char *)Buf->getBufferStart() +
- endian::readNext<uint32_t, little, aligned>(TableEntry);
- assert(IDData < (const unsigned char*)Buf->getBufferEnd());
-
- // Allocate the object.
- std::pair<IdentifierInfo,const unsigned char*> *Mem =
- Alloc.Allocate<std::pair<IdentifierInfo, const unsigned char *>>();
-
- Mem->second = IDData;
- assert(IDData[0] != '\0');
- IdentifierInfo *II = new ((void*) Mem) IdentifierInfo();
-
- // Store the new IdentifierInfo in the cache.
- PerIDCache[PersistentID] = II;
- assert(II->getNameStart() && II->getNameStart()[0] != '\0');
- return II;
-}
-
-IdentifierInfo* PTHManager::get(StringRef Name) {
- // Double check our assumption that the last character isn't '\0'.
- assert(Name.empty() || Name.back() != '\0');
- PTHStringIdLookup::iterator I =
- StringIdLookup->find(std::make_pair(Name.data(), Name.size()));
- if (I == StringIdLookup->end()) // No identifier found?
- return nullptr;
-
- // Match found. Return the identifier!
- assert(*I > 0);
- return GetIdentifierInfo(*I-1);
-}
-
-PTHLexer *PTHManager::CreateLexer(FileID FID) {
- const FileEntry *FE = PP->getSourceManager().getFileEntryForID(FID);
- if (!FE)
- return nullptr;
-
- using namespace llvm::support;
-
- // Lookup the FileEntry object in our file lookup data structure. It will
- // return a variant that indicates whether or not there is an offset within
- // the PTH file that contains cached tokens.
- PTHFileLookup::iterator I = FileLookup->find(FE);
-
- if (I == FileLookup->end()) // No tokens available?
- return nullptr;
-
- const PTHFileData& FileData = *I;
-
- const unsigned char *BufStart = (const unsigned char *)Buf->getBufferStart();
- // Compute the offset of the token data within the buffer.
- const unsigned char* data = BufStart + FileData.getTokenOffset();
-
- // Get the location of pp-conditional table.
- const unsigned char* ppcond = BufStart + FileData.getPPCondOffset();
- uint32_t Len = endian::readNext<uint32_t, little, aligned>(ppcond);
- if (Len == 0) ppcond = nullptr;
-
- assert(PP && "No preprocessor set yet!");
- return new PTHLexer(*PP, FID, data, ppcond, *this);
-}
-
-//===----------------------------------------------------------------------===//
-// 'stat' caching.
-//===----------------------------------------------------------------------===//
-
-namespace {
-
-class PTHStatData {
-public:
- uint64_t Size;
- time_t ModTime;
- llvm::sys::fs::UniqueID UniqueID;
- const bool HasData = false;
- bool IsDirectory;
-
- PTHStatData() = default;
- PTHStatData(uint64_t Size, time_t ModTime, llvm::sys::fs::UniqueID UniqueID,
- bool IsDirectory)
- : Size(Size), ModTime(ModTime), UniqueID(UniqueID), HasData(true),
- IsDirectory(IsDirectory) {}
-};
-
-class PTHStatLookupTrait : public PTHFileLookupCommonTrait {
-public:
- using external_key_type = StringRef; // const char*
- using data_type = PTHStatData;
-
- static internal_key_type GetInternalKey(StringRef path) {
- // The key 'kind' doesn't matter here because it is ignored in EqualKey.
- return std::make_pair((unsigned char) 0x0, path);
- }
-
- static bool EqualKey(internal_key_type a, internal_key_type b) {
- // When doing 'stat' lookups we don't care about the kind of 'a' and 'b',
- // just the paths.
- return a.second == b.second;
- }
-
- static data_type ReadData(const internal_key_type& k, const unsigned char* d,
- unsigned) {
- if (k.first /* File or Directory */) {
- bool IsDirectory = true;
- if (k.first == 0x1 /* File */) {
- IsDirectory = false;
- d += 4 * 2; // Skip the first 2 words.
- }
-
- using namespace llvm::support;
-
- uint64_t File = endian::readNext<uint64_t, little, unaligned>(d);
- uint64_t Device = endian::readNext<uint64_t, little, unaligned>(d);
- llvm::sys::fs::UniqueID UniqueID(Device, File);
- time_t ModTime = endian::readNext<uint64_t, little, unaligned>(d);
- uint64_t Size = endian::readNext<uint64_t, little, unaligned>(d);
- return data_type(Size, ModTime, UniqueID, IsDirectory);
- }
-
- // Negative stat. Don't read anything.
- return data_type();
- }
-};
-
-} // namespace
-
-namespace clang {
-
-class PTHStatCache : public FileSystemStatCache {
- using CacheTy = llvm::OnDiskChainedHashTable<PTHStatLookupTrait>;
-
- CacheTy Cache;
-
-public:
- PTHStatCache(PTHManager::PTHFileLookup &FL)
- : Cache(FL.getNumBuckets(), FL.getNumEntries(), FL.getBuckets(),
- FL.getBase()) {}
-
- LookupResult getStat(StringRef Path, FileData &Data, bool isFile,
- std::unique_ptr<llvm::vfs::File> *F,
- llvm::vfs::FileSystem &FS) override {
- // Do the lookup for the file's data in the PTH file.
- CacheTy::iterator I = Cache.find(Path);
-
- // If we don't get a hit in the PTH file just forward to 'stat'.
- if (I == Cache.end())
- return statChained(Path, Data, isFile, F, FS);
-
- const PTHStatData &D = *I;
-
- if (!D.HasData)
- return CacheMissing;
-
- Data.Name = Path;
- Data.Size = D.Size;
- Data.ModTime = D.ModTime;
- Data.UniqueID = D.UniqueID;
- Data.IsDirectory = D.IsDirectory;
- Data.IsNamedPipe = false;
- Data.InPCH = true;
-
- return CacheExists;
- }
-};
-
-} // namespace clang
-
-std::unique_ptr<FileSystemStatCache> PTHManager::createStatCache() {
- return llvm::make_unique<PTHStatCache>(*FileLookup);
-}
#include "clang/Lex/PPCallbacks.h"
#include "clang/Lex/Preprocessor.h"
#include "clang/Lex/PreprocessorLexer.h"
-#include "clang/Lex/PTHLexer.h"
#include "clang/Lex/Token.h"
#include "clang/Lex/TokenLexer.h"
#include "llvm/ADT/ArrayRef.h"
void Preprocessor::HandlePragmaMark() {
assert(CurPPLexer && "No current lexer?");
- if (CurLexer)
- CurLexer->ReadToEndOfLine();
- else
- CurPTHLexer->DiscardToEndOfLine();
+ CurLexer->ReadToEndOfLine();
}
/// HandlePragmaPoison - Handle \#pragma GCC poison. PoisonTok is the 'poison'.
DiscardUntilEndOfDirective();
}
- if (CurPTHLexer) {
- // FIXME: Support this somehow?
- Diag(Loc, diag::err_pp_module_build_pth);
- return;
- }
-
CurLexer->LexingRawMode = true;
auto TryConsumeIdentifier = [&](StringRef Ident) -> bool {
#include "clang/Lex/MacroArgs.h"
#include "clang/Lex/MacroInfo.h"
#include "clang/Lex/ModuleLoader.h"
-#include "clang/Lex/PTHLexer.h"
-#include "clang/Lex/PTHManager.h"
#include "clang/Lex/Pragma.h"
#include "clang/Lex/PreprocessingRecord.h"
#include "clang/Lex/PreprocessorLexer.h"
PragmaHandlers = std::move(PragmaHandlersBackup);
}
-void Preprocessor::setPTHManager(PTHManager* pm) {
- PTH.reset(pm);
- FileMgr.addStatCache(PTH->createStatCache());
-}
-
void Preprocessor::DumpToken(const Token &Tok, bool DumpFlags) const {
llvm::errs() << tok::getTokenName(Tok.getKind()) << " '"
<< getSpelling(Tok) << "'";
void Preprocessor::recomputeCurLexerKind() {
if (CurLexer)
CurLexerKind = CLK_Lexer;
- else if (CurPTHLexer)
- CurLexerKind = CLK_PTHLexer;
else if (CurTokenLexer)
CurLexerKind = CLK_TokenLexer;
else
case CLK_Lexer:
ReturnedToken = CurLexer->Lex(Result);
break;
- case CLK_PTHLexer:
- ReturnedToken = CurPTHLexer->Lex(Result);
- break;
case CLK_TokenLexer:
ReturnedToken = CurTokenLexer->Lex(Result);
break;
PPOpts.UsePredefines = Record[Idx++];
PPOpts.DetailedRecord = Record[Idx++];
PPOpts.ImplicitPCHInclude = ReadString(Record, Idx);
- PPOpts.ImplicitPTHInclude = ReadString(Record, Idx);
PPOpts.ObjCXXARCStandardLibrary =
static_cast<ObjCXXARCStandardLibraryKind>(Record[Idx++]);
SuggestedPredefines.clear();
// Detailed record is important since it is used for the module cache hash.
Record.push_back(PPOpts.DetailedRecord);
AddString(PPOpts.ImplicitPCHInclude, Record);
- AddString(PPOpts.ImplicitPTHInclude, Record);
Record.push_back(static_cast<unsigned>(PPOpts.ObjCXXARCStandardLibrary));
Stream.EmitRecord(PREPROCESSOR_OPTIONS, Record);
+++ /dev/null
-// Test transparent PTH support.
-
-// RUN: %clang -no-canonical-prefixes -ccc-pch-is-pth -x c-header %s -o %t.h.pth -### 2> %t.log
-// RUN: FileCheck -check-prefix CHECK1 -input-file %t.log %s
-
-// CHECK1: "{{.*[/\\]}}clang{{.*}}" "-cc1" {{.*}} "-o" "{{.*}}.h.pth" "-x" "c-header" "{{.*}}pth.c"
-
-// RUN: touch %t.h.pth
-// RUN: %clang -no-canonical-prefixes -ccc-pch-is-pth -E -include %t.h %s -### 2> %t.log
-// RUN: FileCheck -check-prefix CHECK2 -input-file %t.log %s
-
-// CHECK2: "{{.*[/\\]}}clang{{.*}}" "-cc1" {{.*}}"-include-pth" "{{.*}}.h.pth" {{.*}}"-x" "c" "{{.*}}pth.c"
+++ /dev/null
-// RUN: %clang_cc1 -triple i386-unknown-unknown -emit-pth -o %t1 %s
-// RUN: %clang_cc1 -triple i386-unknown-unknown -emit-pth -o - %s > %t2
-// RUN: cmp %t1 %t2
-// RUN: %clang_cc1 -triple i386-unknown-unknown -emit-pth -o - %s | \
-// RUN: FileCheck %s
-
-// CHECK: cfe-pth
+++ /dev/null
-// RUN: %clang_cc1 -triple i386-unknown-unknown -emit-pth -o %t %S/pth.h
-// RUN: not %clang_cc1 -triple i386-unknown-unknown -include-pth %t -fsyntax-only %s 2>&1 | FileCheck %s
-
-#error This is the only diagnostic
-
-// CHECK: This is the only diagnostic
-// CHECK: 1 error generated.
+++ /dev/null
-// RUN: %clang_cc1 -emit-pth %s -o %t
-// RUN: %clang_cc1 -include-pth %t %s -E | grep 'file_to_include' | count 2
-#include "file_to_include.h"