In such a system, the library would lag
behind language features even more than it currently does.
I don't believe this. Many if not most of the standard library
components appeared in other places first - much of Boost, the type
traits, the STL, etc. Very little of the standard library was created by the
committee from scratch.
What does that have to do with anything I was talking about?
You suggested that the library lags behind language features. I don't think this
is very true. In fact, I'm of the belief that library components come
first and motivate enhancements to the language itself. For example,
function objects were found to be very useful but syntactically
cumbersome. This motivated the addition of lambdas to the language.
STL's `vector` type didn't have move support because move support
/didn't exist/ when STL was created. It didn't have `emplace` because
forwarding references and variadic templates /didn't exist/ back then.
And so forth.
Right. It was only apparent that these facilities were useful when
these libraries were fairly widely used. The language standard follows
code and practice. For the committee to spend too much time trying
to speculatively anticipate the future is not an effective use of its resources.
That is the kind of lagging behind I'm talking about. C++11 brought
about those language features, and C++11 simultaneously added those
features to the standard library components that could use them. That's
only possible because the two evolve together.
Those features are/were added to all libraries and user code - not just
those designed by the committee.
Separating the language from the library will make it difficult for that
to happen. That's bad.
No, it won't.
When the standards committee gets involved, it hobbles progress.
Ranges, Networking - very complex components. They have been available for
years, while the standards committee's versions won't be available for
years to come.
You can use that argument for pretty much any big feature, language or
library.
LOL - of course I can, and I do - because it's a good argument.
I mean, before C++11 came out, GCC had a nearly-complete
implementation of the large Concepts proposal. But the committee decided
not to move forward with it, so progress was hobbled for a good 7 years.
As you point out, the idea of concepts is an example where design by
committee has not worked out well.
Most if not all the benefits of the future concepts implementation can
and have been implemented as libraries. The Boost Concept Library
(2004) was somewhat crude but effective. The Tick library (proposed to
Boost but not accepted) includes a quite complete and effective
implementation based on C++11. But neither of these libraries is widely
used in spite of a large amount of evangelism at conferences. Of the
very few libraries which actually have documentation, only a few
actually use concepts in their designs - much less enforce them in their
code. In spite of all this, the C++ community has spent huge amounts of
effort promoting and designing this feature as part of the language.
Much better would have been to just let libraries implementing concepts
develop. If they become successful and it becomes apparent that a
change in the language would make them better, then the language can be
enhanced to better support them. This is similar to your example for
vector, where vector was invented, found wide usage, and motivated some
enhancements in the language (move).
The same goes for many other C++ features. If we just let
compilers independently create their own separate language features,
then C++ would advance much more quickly.
I have not proposed this.
But compiler vendors have proposed their own incompatible
features/extensions. The committee cannot and does not prohibit this.
And it would advance in incompatible ways. Standardization of the
language creates standardization of implementations. The same is true of
the standard library.
Under my proposal the standard library would be smaller. But all
implementations would be compatible.
But that's only true if the library doesn't depend on the platform.
I've proposed limiting library standardization to those things which are
platform dependent, such as i/o (fopen, etc.), to facilitate the writing of
portable programs.
A problem with dropping things from the standard is that it creates a
lack of interoperability. Take the entire STL iterator mechanism. I can
write new algorithms that work within the iterator model, and someone
else can write new containers that provide iterators within that model.
Because we're both coding against the same standard model, their
containers can work with my algorithms. That's good.
Right. In my world, standard components would still exist. Someone
would make them and they would become "effectively" standard in that
many people come to depend upon them. ASIO and Eigen are good examples. So
"standard components" will always exist. But this doesn't mean that the
ISO C++ standards committee has to be responsible for designing all of
them. I argue that it doesn't have to spend time on this, that it's not
a good use of its time, and that it doesn't produce the most effective results.
Take `optional<T>`. Because it's available to everyone, it becomes a
lingua franca type that everyone can use. You can stick it in an
interface and not feel like you're making someone else use something
they'd rather not use. Something similar can be said for `string_view`.
That's good too.
Boost.Optional has existed for many years and has wide use. Spending
C++ committee resources to (re)design it added nothing to C++.
Having standard solutions to certain library problems is useful and valuable.
I don't think it's necessary.
And that cannot be achieved by using arbitrary external libraries.
The trickiest functionality - networking, linear algebra, GUI, serialization -
is all currently handled by non-ISO libraries. I don't see the value
of the C++ committee trying to contribute to that.
Take the STL iterator model. It is baked into the language, thanks to
range-based `for`, and it's baked into the STL library.
Hmmm - I'm not seeing that. But maybe you're right. Before I had
range-based `for`, I used BOOST_FOREACH for the same effect. I'm not
totally convinced that it had to be "baked into the language". If so, it
raises questions about the design of the feature.
You couldn't do that with a library-external model.
And you might have multiple competing iterator models. Some might look
like STL iterators, some might look like Java's "iterator", etc. And
they'd be incompatible.
Could be. But I doubt it. The most successful libraries become dominant
and tend to drive out alternatives... until some compelling better
alternative comes along. This is a general principle which shows up in
evolution, capitalism, art, politics and other fields which are not
centrally planned.
And while one model might be better in some situations than another, the
fact is that picking /some answer/ is better than "let the community sort it
out", because the C++ community has proven that it is really terrible at
sorting such things out.
Hmmm - did you write that or did I? I don't remember.
If move support exists, only a short time will pass before library
writers take advantage of it. The same goes for the other features.
Maybe. Maybe not.
Tell me: how long did it take `boost::shared_ptr` to get move support?
What about `boost::optional`? Or `boost::any`? And those are the ones I
know off the top of my head. Even now in 2017, is move support
ubiquitous throughout Boost? Even in the libraries that aren't well
maintained?
To be honest I don't know. I certainly haven't heard anyone complain.
But the real answer is that if there exist superior components to the
boost ones, the boost ones will be driven out when that becomes apparent.
And what about when a language feature strongly encourages a library
redesign? Consider `boost::variant`. Move support is a game changer for
`variant` assignment, as it reduces the circumstances in which a `variant`
might need to allocate memory to avoid thrown exceptions. At that point,
the decision to allow memory allocation /at all/ starts becoming
questionable.
Another interesting example. I know there has been a lot of dispute
about the various ways variant can/should be implemented. This is
chiefly due to the implementation of the assign operation in the
presence of multi-threading, exceptions, etc. I'm guessing that
different situations will call for different implementations and type
requirements. So trying to standardize this so that everyone or most
people are satisfied is not a great idea. (FWIW - I suggested just
deleting the assignment, as it works well for my use cases; the response
was ...)
So no, it takes more than "a short time" for such changes to appear.
As for "standards for library development", this idea suggests the
ISO C++ committee has more power than it actually does. It can no more
enforce "standards for library development" than it can make people
respect that `class` and `struct` are not supposed to be
different. And without any genuine enforcement, such a "standard" is
meaningless.
Of course I'm aware of this. But I've never meant to suggest that this
be "enforced". This is why I included other mechanisms to encourage
higher standards.
People use libraries because they're useful, not because they follow
some "standard for development".
This is my point exactly. Useful libraries have no need to be part of the
C++ standard library. Today we have many libraries that are widely
used and are not part of the standard: Eigen, Boost, multiple
serialization libraries, ASIO networking, CGAL and many, many, many
more. Being part of the standard wouldn't make these libraries better
and not being part of the standard has not hindered their success.
Being part of the standard would make them /part of the standard/ and
thus reliable for all C++ users.
"Standardization" and reliability are separate issues.
That matters in part because of the
difficulty of just using someone else's library,
"Standard libraries" are not guaranteed to be easy to use. And many are not.
but also because of interoperability.
I'm not sure what interoperability means in this context. I doubt it's
a real problem.
This is a problem only because A and B implemented two types that do the
same thing. If they'd just used `std::any`, this wouldn't be a problem.
But if you had your way, there would be no `std::any`, so you'd have two
Balkanized libraries that only work with themselves and code written
specifically against them.
Right. This is the essence of the argument for the current system. I
don't dispute that the argument has merit. But I don't think that this
is the only consideration. The current system also has a lot of
problems which are widely noted. Something is going to have to change.
My view is that the only thing that will make a difference will be to
narrow the scope of what the committee has to deal with. I think a lot
of people would agree with this. At least until it has to be decided
what should be eliminated from that scope.
The thing is, we both agree that there are domains the committee should
not be exploring. It's simply a matter of where that line gets drawn. To
you, that line should be "OS features" and that's it. To me, my concern
is primarily about having platform neutrality and general code
interoperability.
Hmmm... I don't see much difference between what I mean when I say "OS
features" and what you mean by "platform neutrality".
You received this message because you are subscribed to the Google Groups "ISO C++ Standard - Future Proposals" group.
To view this discussion on the web visit https://groups.google.com/a/isocpp.org/d/msgid/std-proposals/peva40%242lh%241%40blaine.gmane.org.