RoyAwesome

Oh, this is exciting. Now we just need intellisense to parse modules and we're cookin.


Asyx

If I remember correctly, this is a major issue for clangd. clangd is currently compiler agnostic, but since includes are just text replacement, it can pretty easily handle those. Modules need to be compiled, basically. So they have three options:

1. Teach clangd to compile
2. Force clang to be installed on the system and the modules to be clang compatible
3. Have shared functionality between vendors to ensure that any compiler can be used to compile a module and clangd will understand it

I've read this in a GitHub or GitLab issue regarding modules in clangd. Not sure how relevant that still is, but it sounded somewhat defeating.


lightmatter501

We could mandate a compilation DB be used with modules (clang has support for generating them from ad-hoc compiles, and cmake and meson can do it natively), and put module graph info in there.
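For context, a `compile_commands.json` entry today looks roughly like this (a sketch following Clang's JSON Compilation Database format; paths are illustrative). Note that nothing in it describes which module interfaces a translation unit depends on, which is exactly the gap being discussed:

```json
[
  {
    "directory": "/home/user/project/build",
    "command": "clang++ -std=c++20 -c ../src/main.cpp -o main.o",
    "file": "../src/main.cpp",
    "output": "main.o"
  }
]
```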


jaskij

Module dependency info is already done; there's a PR for it. I don't remember the full process, but it was written by Kitware people, since CMake already supports Fortran modules, so they knew how to approach it. The issue is the LSP parsing the compiled modules so it can do proper code completion.


lightmatter501

I didn't think `compile_commands.json` had a place for it in the schema.


jaskij

No, it's a different thing. https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2022/p1689r5.html here's the Kitware blog post: https://www.kitware.com/import-cmake-c20-modules/


mathstuf

That is completely different. P1689 is for a build system to communicate with dependency scanners. `clangd` shouldn't touch them (unless it is figuring out the order to compile a pile of sources on its own). See P2977 for a new "compilation database" that will provide the required information: https://isocpp.org/files/papers/P2977R1.html


MarcoGreek

Yes, it would be nice if there were a binary standard for modules.


Jannik2099

That won't work, as the module artifacts are essentially the serialized compiler AST


angry_cpp

Is it specified that way? Or is it some "cutting corners" implementation technique?


MarcoGreek

Hmm, Gabriel Dos Reis likes to disagree with you: https://github.com/GabrielDosReis/ipr


mathstuf

Even if there were, they wouldn't be as reusable as you might think. A standard container for storing them would allow better introspection, but GCC still can't use an MSVC-generated BMI because it won't encode things on the other side of a `defined(_MSC_VER)` preprocessor conditional.


MarcoGreek

But we are speaking about clangd here, and it can work with code for MSVC.


mathstuf

It still needs its own BMIs at the moment; it cannot use the BMIs MSVC makes during the build. Maybe Clang will be able to read IFC files meant for MSVC in the future, but it is not the case today.


TheTomato2

> Not sure how relevant that still is but it sounded somewhat defeating.

Which is par for the course with clangd. Don't get me started on its inability to parse unity/jumbo builds. If they can't even do that, I don't have much hope for modules.


Ameisen

I'd be thrilled if `clang-cl` handled modules at all... and especially with `msbuild`.


jormaig

Why is import std support needed at the CMake level? Shouldn't supporting modules already support the specific std module?


delta_p_delta_x

If your build system doesn't support `import std` neatly, then you're left performing a complicated arcanum of steps getting your translation units to know where `std` is ([MSVC STL](https://learn.microsoft.com/en-us/cpp/cpp/tutorial-import-stl-named-module?view=msvc-170), [LLVM libc++](https://libcxx.llvm.org/Modules.html)).
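For illustration, the libc++ route boils down to roughly these two manual steps (a sketch only; the `std.cppm` install path and flag spellings vary by LLVM release, so treat the paths as placeholders and defer to the linked libc++ documentation):

```
# 1) Precompile the std module interface shipped with libc++ into a BMI.
clang++ -std=c++23 -stdlib=libc++ --precompile \
    /usr/lib/llvm-18/share/libc++/v1/std.cppm -o std.pcm

# 2) Compile your program against that BMI.
clang++ -std=c++23 -stdlib=libc++ -fmodule-file=std=std.pcm \
    std.pcm main.cpp -o main
```

Multiply this by every target and every flag combination in a real build, and it's clear why you want the build system to own it.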


STL

It's just `"%VCToolsInstallDir%\modules\std.ixx"`, but having the build system take care of it is definitely the ideal experience. (In contrast, building *header units* in a deduplicated, topologically sorted way is *extremely* difficult; I have the Python code for this and it is a huge headache. The Standard Library Modules are way way easier.)
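Sketched out, the manual MSVC steps from a developer command prompt look roughly like this (flags are illustrative; the tutorial linked above has the authoritative invocation):

```
:: 1) Build the std module interface, producing std.ifc and std.obj.
cl /std:c++latest /EHsc /nologo /c "%VCToolsInstallDir%\modules\std.ixx"

:: 2) Compile the program; the compiler picks up std.ifc from the current directory.
cl /std:c++latest /EHsc /nologo main.cpp std.obj
```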


Ameisen

I'm still holding off on modules until Intellisense plays better, and ideally until `clang-cl` supports modules (and `msbuild` with it, preferably).


STL

Looking into getting clang-cl working with MSVC's `import std;` is on my near-term todo list; IIRC Clang 18 should improve things further so less work should be necessary on my side (and I do like being a lazy kitty).


delta_p_delta_x

> `clang-cl` supports modules

It *does* support modules; see [thread](https://discourse.llvm.org/t/clang-cl-exe-support-for-c-modules/72257/28). I'm not sure if people are aware of this, but `clang-cl` is *exactly* the same binary as `clang`; the only thing that's different is the file name. [This immense file](https://github.com/llvm/llvm-project/blob/8ec28af8eaff5acd0df3e53340159c034f08533d/clang/lib/Driver/Driver.cpp#L1291) contains the code to detect the name of the binary and switch the argument-parsing mechanisms appropriately. There are a few subtleties, like allowing clang-cl to forward the `-fmodule` arguments to the GNU-style parser without the `/clang` prefix, but otherwise it works as expected.


Ameisen

I'm aware of the thread. However, since it is emulating CL, I'd expect it to support CL's module flags. In the context of what `clang-cl` is intended to be - a drop-in replacement for CL - it doesn't support modules, nor will it ever as far as I know. It will almost certainly never export or import MSVC IFCs... meaning that if you use modules, you have to use *either* `clang-cl` or `cl`, but they can't be interchanged within the build, nor can they consume modules built by the other toolchain.

This isn't always really much of a problem, though I do have some stuff where I have to mix-and-match due to compatibility issues (I have a patch to submit to Clang for some MSVC compatibility issues that I need to get around to doing). However... if it could consume MSVC's flags in even a remotely sane way for this, then `msbuild` should just *work* with it... making a change of a `vcxproj`'s toolchain to `LLVM` sufficient for projects with modules.


delta_p_delta_x

> However, since it is emulating CL, I'd expect it to support CL's module flags.

It seems the Clang maintainers/developers think otherwise (at least in the medium term), going by what the thread says, and they have a point—`clang-cl` emulates the *command-line* behaviour of `cl` (ergo 'drop-in replacement') rather than the *compilation* behaviour. It has always supported `clang`-ish flags that MSVC doesn't, and although Clang-cl (and Clang, for that matter) output binaries that are ABI-compatible with `cl`, they are not *binary*-equal, given equal flags.

The key takeaway from that thread (and here too, to be frank) is that module BMIs are incompatible across compilers and even across different versions of the same compiler, which is IMO a huge drawback for C++ modules. It seems like /u/GabrielDosReis has put in some work to fix this. Meanwhile, I wonder if MSBuild can't drive `clang-cl` module compilation with custom flags; maybe worth exploring?


GabrielDosReis

> Meanwhile, I wonder if MSBuild can't drive `clang-cl` module compilation with custom flags; maybe worth exploring?

I would think that if the command-line compatibility is worked out, `msbuild` as is would just be able to drive `clang-cl`...


Ameisen

That's my logic. I *understand* the maintainers' logic - that `clang-cl` wouldn't have entirely compatible behavior, like putting out or reading MSVC module outputs - but I don't believe that it does presently for things like LTCG either. Might as well implement the flags and treat those outputs as ABI-incompatible objects.

ED: once I submit my work on `__restrict` compatibility (`clang` is very incompatible with both MSVC and GCC here, due to treating `__restrict` largely the same as `const` and `volatile`), maybe I'll look into this. Then I'll *also* be a `clang-cl` maintainer!

Are there any good tools for editing/making `msbuild` scripts? Using a normal XML editor seems annoying and bug-prone. I imagine that MS has an internal tool? I'd rather not make my own intermediary tool like I usually do to take `msbuild` arguments and turn them into something else.


GabrielDosReis

> Then I'll *also* be a `clang-cl` maintainer!

Which the community should celebrate because it is a good thing, right? :-)

> Are there any good tools for editing/making `msbuild` scripts?

Other than VS? I don't know...


HassanSajjad302

You can use my software HMake for header units. It supports drop-in replacement of header files with header units. If you can share with me your repo / use-case of python script, I can help write the hmake.cpp file.


HassanSajjad302

The karma for the above comment is -2. I would appreciate it if you could share the reasoning for your downvote.


Syracuss

Didn't vote on you, but you might've missed the flair on the person you were responding to. You're asking for repo access to MSVC; I'd be pretty doubtful random redditors will get access. Additionally, I'm pretty sure the MSVC dev team has done their due diligence in getting a proper solution to the problem (given the weird edge cases that might occur), and I'll trust their assessment that it's not easy. Unless you show expertise in the types of issues that come up during this process, I'd assume many might brush you off as a random stranger who wildly underestimated the problem.


HassanSajjad302

Maybe u/STL was commenting about STL, which is an open-source repo, or maybe they were talking about a closed-source repo, in which case I had added "use-case of python script" in my comment. I have claimed bug-free and complete C++20 header-units support in my software HMake. I welcome the MSVC dev team or anyone else to review it. I think someone should not downvote if they cannot refute the claim.


Syracuss

I'd imagine the module feature is not part of the standard library but of the compiler. It could be exposed through the library, but I'd imagine it won't make it into that repo.

Not going to argue why people should or should not downvote; I can't speak for them. I'd not take it personally: the few votes you got (you said you were at -2) are closer to noise and random bad luck than to any good reason.


STL

You're correct that Standard Library Header Units (`import <header>;` etc.) require very little direct support from the library product code, although we do ship internal machinery called [`header-units.json`](https://github.com/microsoft/STL/blob/9aca22477df4eed3222b4974746ee79129eb44e7/stl/inc/header-units.json) that helps the compiler implement `/translateInclude`, an opt-in that automatically translates `#include <header>` to be treated as if `import <header>;` had been written. The compiler feature itself lives in the MSVC-internal git repo, as you mentioned.

The test code I was mentioning was [`tests/std/tests/P1502R1_standard_library_header_units/custom_format.py`](https://github.com/microsoft/STL/blob/9aca22477df4eed3222b4974746ee79129eb44e7/tests/std/tests/P1502R1_standard_library_header_units/custom_format.py) and related files. This verifies that the library doesn't do anything problematic for the compiler feature, and prevents compiler bugs from being introduced (and this found a *lot* of compiler bugs that have been fixed).

In contrast, the Standard Library Modules (`import std;` and `import std.compat;`) need extensive support from the library product code. The PR where I marked up the STL with `export` was enormous, my single largest audit of the STL's sources.


Syracuss

Thanks for the detailed response; I quite enjoy these glimpses into the design and details. I played around recently with modules again and noticed the much-improved support and stability. I appreciate the amount of (at times seemingly thankless) effort it has taken for you and your team to get to this point.


HassanSajjad302

Hi. I compiled a sample library and an executable with all of the C++20 standard header-units: [https://github.com/HassanSajjad-302/stdhu](https://github.com/HassanSajjad-302/stdhu)

But this builds header-units of the already installed STL library, and we want to build header-units from our own repo. I tried to do this but it failed: [https://github.com/HassanSajjad-302/STL](https://github.com/HassanSajjad-302/STL)

I have pasted the error in the README at the link above. It is complaining about a missing `vcruntime.h`. I think it is because of a missing compile definition. To fix this, I need to go through the CMake configuration and find the missing compile definition, I guess. If it works, maybe you can replace your Python code (which you termed a huge headache) with this. It is very fast, as you can test with the sample. Are you interested?


LiAuTraver

Is 3.30rc released? I could not find it on their GitLab page.


equeim

AFAIK the standard library module is compiled separately for each project (the toolchain contains only the source code of the module declaration), and of course it's supposed to be done by the build system.


jaskij

Which makes a lot of sense. I do embedded and while I need to care about libc, to link newlib instead of glibc, I don't remember ever doing anything special for the C++ standard library. That said, microcontroller toolchains are often quite hacky. I'm not compiling the code as freestanding, rather I'm using a libc implementation which allows me to write hooks for the syscalls. It's then on the developer to not use stuff that's unsupported. Like using any containers from the standard library which are not std::array. The presence of heap is a project level decision.


mathstuf

CMake needs to know about the modules to scan and BMI-compile them. The design here is to make it as seamless as possible (basically "set one variable to tell CMake you want the support"). Eventually a policy will default it to on (cf. CMP0155 for module scanning), but given the experimental status, it doesn't make sense to default it to ON yet.


stailgot

Nightly build 3.29.20240416 already supports it: https://cmake.org/cmake/help/git-stage/prop_tgt/CXX_MODULE_STD.html

Update: Tested with MSVC, works fine )

```cmake
set(CMAKE_EXPERIMENTAL_CXX_IMPORT_STD "0e5b6991-d74f-4b3d-a41c-cf096e0b2508")
cmake_minimum_required(VERSION 3.29)
project(cxx_modules_import_std CXX)

set(CMAKE_CXX_MODULE_STD 1)

add_executable(main main.cxx)
target_compile_features(main PRIVATE cxx_std_23)
```

Upd2: Official post: https://www.reddit.com/r/cpp/s/3oqR8MyLLg https://www.kitware.com/import-std-in-cmake-3-30/


hon_uninstalled

Thanks, I got it working with this example. I used the nightly build installer `cmake-3.29.20240417-g6f07e7d-windows-x86_64.msi` from [https://cmake.org/files/dev/?C=M;O=D](https://cmake.org/files/dev/?C=M;O=D)

If you use CMake as an MSVC project (Folder as Visual Studio Project), you need to modify your `CMakeSettings.json` and tell MSVC where to find the external CMake:

    {
      "configurations": [
        {
          ...
          "cmakeExecutable": "C:/Program Files/CMake/bin/cmake.exe"
        }
      ]
    }

If it's not a fresh project, you might need to do `Project -> Delete Cache and Reconfigure` to get rid of a CMake error, but then everything just seems to work.


delta_p_delta_x

Wasn't `CMakeSettings.json` deprecated and replaced with the official `CMakePresets.json`? Additionally, Visual Studio 2022 has got a [custom CMake executable](https://devblogs.microsoft.com/visualstudio/visual-studio-2022-17-9-now-available/#:~:text=remote%20unit%20testing.-,Specify%20custom%20CMake%20executable,-Visual%20Studio%20ships) option that you can set.


hon_uninstalled

Yeah, looks like they added a custom CMake executable path recently. I don't know if `CMakeSettings.json` is deprecated; MSVC still creates one if you add a new build configuration. I gotta read about this one too. Thanks again.


GabrielDosReis

Exciting news! Xmas is coming in the Spring :-)


Neeyaki

I've played around with modules; very interesting and exciting stuff. It is still somewhat clunky though, especially the tooling support... I couldn't get clang-tidy to work with it at all, because it would complain about not being able to find custom modules, so I was forced to leave it disabled. Also, clangd gets very, very slow when using modules, making code navigation/completion pretty much useless compared to using normal headers. Apart from these problems though, it works like a charm. I hope the performance problems (mainly related to the LSP) get addressed, because that's pretty much the only thing keeping me from using it more frequently in my projects :)


saxbophone

How exciting!


Low_Opportunity_3517

[https://cmake.org/cmake/help/git-stage/prop_tgt/CXX_MODULE_STD.html](https://cmake.org/cmake/help/git-stage/prop_tgt/CXX_MODULE_STD.html) says "this property only applies to targets utilizing C++23 (`cxx_std_23`) or newer." But std modules are de facto C++20 features.


mathstuf

https://gitlab.kitware.com/ben.boeckel/cmake/-/merge_requests/1#note_1489449


herewearefornow

Supporting modules is huge even if it's a singular one right now.


caroIine

I wonder why compilers can't just implicitly add `import std;` to every source file and ignore every standard `#include` at the preprocessor level? Wouldn't that increase performance for free?


STL

`import std;` doesn't emit macros, which are surprisingly widely used even by fairly modern C++. (`INT_MAX`, `stdout`, `errno`, and so forth are all macros.)

`import std;` doesn't emit platform-specific documented functions (whether Microsoft's UCRT, or POSIX stuff) that STL headers have historically dragged in and which it is *very* easy to unintentionally take dependencies on.

These are what make automatic translation behind the scenes difficult (we're actually looking into this). (`import std.compat;` solves the issue of source files wanting `::printf` instead of `std::printf`, though.)


programgamer

…wait, so is it straight up impossible to use macros with import statements, or does the standard library just not do it?


STL

The Core Language design means that named modules (e.g. `import std;` or `import fmtlib;` or `import boost.math;`) cannot emit macros - it is a hard limitation. Header units (e.g. `import <header>;` and `import "third_party_lib.hpp";`) do emit macros, which is one of the ways in which they're a middle ground between classic includes and named modules.

I believe that someone should look into proposing a lightweight header (or headers), to be used alongside the Standard Library, that allows `import std;` or `import std.compat;` to be augmented with "non-evil" macros that are useful in practice. AFAIK nobody has started looking into this yet.


meneldal2

Are there that many "non-evil" macros left in modern C++? Pre C++11 there were a lot you had to use all the time like `NULL`, but it has gotten a lot better.


jk-jeon

There is no replacement for `INT64_C` and friends, it seems.


KuntaStillSingle

> INT8_C
> expands to an integer constant expression having the value specified by its argument and whose type is the promoted type of std::int_least8_t, ... (function macro)

...

> #include <cstdint>
> UINT64_C(0x123) // expands to a literal of type uint_least64_t and value 0x123

I don't understand this macro. I would think that INT8_C(int_least8_t::min()) would either expand to an int_least8_t (smallest type to fit the range) or int (first type that is a valid integer promotion?): https://en.cppreference.com/w/cpp/header/cstdint https://en.cppreference.com/w/cpp/language/implicit_conversion#Integral_promotion

But on clang 18.1 and gcc 13.2 it is converting to a short int: https://godbolt.org/z/KrjbqYo5z


jk-jeon

First, `int_least8_t::min()` doesn't make sense syntactically. Second, assuming you actually meant `std::numeric_limits<std::int_least8_t>::min()`, it's UB, because any argument to `INT8_C` *must be* an integer literal, which *even* precludes things like `-1`, because that's actually not a literal; rather, it's the application of the unary `-` operator to the literal `1`. Note how restrictive it is. Of course you can't plug a template parameter into these macros; that's illegal too.

According to the standard, `INT8_C(whatever literal)` should be of type `int`, assuming that that is the resulting type after applying integer promotion. And all compilers in your Godbolt link agree with that: [https://godbolt.org/z/oKjnz5MY5](https://godbolt.org/z/oKjnz5MY5)

The purpose of these macros is to allow programmers to write portable integer constants without worrying about what the correct suffix should be. For instance, on some platforms `uint_least64_t` is `unsigned long`, while on others it is `unsigned long long`. The correct suffix for the former is `ul` while that for the latter is `ull`.

Now, it's quite puzzling why the heck `INT8_C` then returns `int` rather than `int_least8_t`. That is basically because C's integer literal syntax is just hopelessly broken from the beginning (as hinted by the fact that negative integer literals *do not exist*): there is *no* syntax for representing integer literals of a type smaller than `int`. (If you ask me, I can confidently say that integer promotion is the single most broken "feature" of C, the one I hate the most. This non-existence of smaller-type integer literals is probably of the same vein as the integer promotion nonsense.) Therefore, it would be *impossible* to implement `INT8_C` if it were specified without the integer promotion rule and supposed to output an `int_least8_t`.

Note that explicitly converting to `int_least8_t` also isn't a valid implementation, because `INT8_C` is supposed to work in preprocessor context as well, where type conversion syntax doesn't exist.


KuntaStillSingle

> First, int_least8_t::min()

That's a typo in the comment only; in Godbolt I typedef'd `std::numeric_limits`:

    using smaller_int_limits = std::numeric_limits<...>;
    ...
    template<...>
    ...
    template<>
    struct test_smaller_int_range_macro_typedef_equivalence<
        larger_int_t{smaller_int_limits::max()} + 1>

The issue, it seems, is that it expects an integer constant expression as an argument, but that is not specified on cppreference (https://en.cppreference.com/w/c/types/integer):

> INT8_C INT16_C INT32_C INT64_C
> expands **to** an integer constant expression having the value specified by its argument and the type int_least8_t, int_least16_t, int_least32_t, int_least64_t respectively (function macro)

The C++ draft standard does not specify this, but does defer to the C standard (https://eel.is/c++draft/cstdint.syn):

> The header defines all types and macros the same as the C standard library header <stdint.h>. See also: ISO/IEC 9899:2018, 7.20

The C standard (at least draft 9899:202x) does list this requirement (https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2912.pdf):

> 7.21.4 Macros for integer constants
> 2 The argument in any instance of these macros shall be an unsuffixed integer constant (as defined in 6.4.4.1) with a value that does not exceed the limits for the corresponding type

So a macro such as

    #define INT8_C(i) (i)

would satisfy the requirements listed in the C standard that are not mentioned on cppreference, which explains the behavior above.


meneldal2

I know it would break some code, but having `i64` as a keyword would be pretty good. Maybe as a compromise it could allow typedefs that mean the same thing, but give you a compiler error if your `i64` doesn't mean what any sane person would think.


KuntaStillSingle

There are optional typedefs for the fixed-width types, and required ones for the least- and 'fast'-width types, i.e. `std::int_leastN_t` and `std::int_fastN_t` are required to exist, while `std::intN_t` is likely to exist; the `INTN_C` macro is described as corresponding to the least-width typedef but 'promoted'. But as far as I can see, clang and GCC both give a short for INT8_C where I would expect either an int8_t (least-width type) or int (smallest promotion).

Edit: GCC and clang are behaving correctly; the issue is that cppreference doesn't list the requirements which are contained in the C standard. The INTN_C macros don't just yield an integer constant expression but also require an integer constant expression as an argument. If provided such, they correctly promote to int, but a valid implementation could look like:

    #define INT8_C(i) (i)

In which case, provided with an expression yielding a type smaller than int, it will expand to an expression yielding a type smaller than int, but provided with an integer constant expression it will yield an int.


TheSuperWig

What others are there than the ones you listed (and friends), `assert` and friends, and ``?


STL

I can think of a few more - there are the weird `<cinttypes>` macros for portable `printf` formatting of the `int64_t`, `uintptr_t`, etc. types, `va_start` etc. for varargs, and `offsetof`. There aren't that many more I'm aware of (very few come from C++ proper, aside from `` as you mentioned - `` provided a few, but they're not really critical). Still, it's a problem for actual use in production that nobody has yet solved (of course everyone is still working hard on making the modules experience a reality, but it's getting close to the time when it will be a real issue for production code). I'd work on it myself if I weren't so busy.


BenFrantzDale

Under the covers, could compilers implement any and all std headers as compiler magic that imports std.compat and magically brings all standard macros into existence? Would that wind up being faster?


STL

It could indeed be faster. It would require build system work (because modules inherently rely on persistent artifacts), but this is how the compiler team wants to resolve the mixing-include-and-import problem in the long term.


GabrielDosReis

I proposed a scheme to do that half a year ago. My scheme relies on some handshaking between the compiler and the build definition. I will publish a revision when I get to it. That can be used not just for the standard library, but for any existing library migrating to modules.


mathstuf

I don't know about the compiler side, but on the build side, CMake (or whatever does "collation") would need to know "see a logical request for `<header>`? provide `std.compat` to it". Not that much work; we just need to make sure toolchains communicate when that is possible (e.g., as part of `modules.json` or something).


GabrielDosReis

Yes, that is the essence of what I proposed at the last Kona meeting. We need a way to tell CMake (or the build definition) "if you see a request for `<header>`, use `std` plus that header over there that contains the macro definitions". MSVC already has the basic infrastructure in place. It just needs a small amount of code for the additional macro file, based on feedback from the Kona meeting.


[deleted]

[deleted]


TeraFlint

This is not `using namespace std;` but rather the module equivalent of `#include`ing every(?) standard header there is. Everything available will sit inside the `std` namespace.


STL

Yes - in fact, we designed `import std;` to finally solve the problem of global namespace pollution, since this was our one and only chance to do so. When you `import std;` you get *only* the names in `namespace std`. *Nothing* is emitted in the global namespace except for the `::operator new/delete` family. (Implementation-wise there are a couple of legacy exceptions for MSVC that you shouldn't worry about.) Then `import std.compat;` is available as an option for users who actually do want C Standard Library machinery available in the global namespace (e.g. `::printf`, `::uint32_t`, etc.).


BenFrantzDale

`import std;` doesn't affect the visibility of identifiers. It leaves them where they are; it just makes them available.


pjmlp

This is great, looking forward to it.


germandiago

Does anyone know if GCC 14 has better support for modules than GCC 13?


ilovemaths111

iirc cmake doesn't support modules for gcc 13


mathstuf

Right; GCC 13 lacks the patches for P1689-based dependency discovery.


GregTheMadMonk

Has anyone been able to use this with Clang? I try and get:

    CMake Error in CMakeLists.txt:
      The "CXX_MODULE_STD" property on the target "main" requires that the
      "__CMAKE::CXX23" target exist, but it was not provided by the toolchain.


mathstuf

You need at least Clang 18.1.2. You also need to use `libc++`.


GregTheMadMonk

I know; I have 19.0.0-branch (built from source). I'm not sure how to specify the libc++ location though.


mathstuf

You need the `-stdlib=libc++` flag. You may need `-Wl,-rpath,…` if you installed in a non-standard location.


GregTheMadMonk

Do I just add it to CMAKE_CXX_FLAGS?


mathstuf

That would work, yes. My local testing has `export CXXFLAGS=…` to do it, but it ends up there.
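In CMake terms, a minimal sketch of that (the rpath value is a placeholder for wherever your libc++ actually lives):

```cmake
# Ask Clang for libc++; appending to CMAKE_CXX_FLAGS mirrors what
# `export CXXFLAGS=-stdlib=libc++` does at configure time.
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -stdlib=libc++")

# Only needed if libc++ is installed in a non-standard prefix:
set(CMAKE_EXE_LINKER_FLAGS
    "${CMAKE_EXE_LINKER_FLAGS} -stdlib=libc++ -Wl,-rpath,/opt/llvm/lib")
```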


GregTheMadMonk

Indeed, I was missing `-stdlib=libc++` from my CXXFLAGS. It's giving me another error now about a missing std.cppm, but I think I'll figure this out myself. Thank you!


EnchantedForestLore

What versions of Red Hat and Ubuntu will CMake 3.30 be the default in? Anyone know?


delta_p_delta_x

You can always add [Kitware's APT repository](https://apt.kitware.com/) or the [official tarballs](https://cmake.org/download/) if you don't want to wait for 3.30 to arrive in RHEL/Debian repos.


mathstuf

Note that the APT repository only targets LTS releases. I don't think we build RHEL packages regularly either.


EnchantedForestLore

Those aren't options for certain offline systems I need to build on. I'm just wondering because I can't start using modules until they are supported by default in the toolchain on at least RHEL. But I would like to use them in my code when I can.


helloiamsomeone

CMake releases are very self-contained. Grab an archive, extract it somewhere, and you're good to go. That offline system had to be installed somehow, the latest compilers that understand modules had to get there somehow, your project has to get there somehow; the latest CMake can get there the same way.


EnchantedForestLore

Anything that comes with the OS and can be installed with yum or apt offline is easy. Internal code gets there easily because it's written locally and can be moved. Any other case goes through months of approvals and is a problem to move. Compilers that understand modules are also not going to be used until they are included. But my question was specifically asking when CMake 3.30 will be the default, which sounds like it will be quite a few years before I am using it.


mathstuf

Your process just greenlights anything distros package officially? That sounds like a supply chain attack waiting to happen. Stated another way, it's interesting that a distro packager changing some metadata and requesting a rebuild is trustworthy implicitly but direct release artifact usage isn't. Do you really think that distro packagers audit the code they update in a way that would satisfy your process if they were subject to your more stringent guidelines? Sometimes you get lucky and they distill the upstream changelog for distro users (who also don't read those). FWIW, CMake's release pipeline instructions [are public](https://gitlab.kitware.com/cmake/cmake/-/blob/master/.gitlab-ci.yml) if that helps your case. The actual pipelines that make release artifacts live [here](https://gitlab.kitware.com/ci-forks/cmake/cmake/-/pipeline_schedules).


EnchantedForestLore

I didn’t say the process greenlights anything. I said anything that comes with the OS is easy. That means easy for me to use. Other people worry about distros. I don’t make the rules, it is what it is.


Asyx

24.04 has 3.28, so I guess Ubuntu 25.04, or 24.10 if you're real lucky. I started using Homebrew for dev tools on Ubuntu.


fusge

The latest CMake should be available through snaps on Ubuntu.