## Acknowledgements (temporal order)

• ccnSim development started in the context of the ANR Project Connect
• ...continued thanks to funding from the EIT ICT Labs project on Smart Ubiquitous Content
• ...continued thanks to funding from the EIT ICT Labs project on Virtual Data Plane for Software Defined Networks
• ...and now continues thanks to Cisco's Chair NewNet@Paris

## People (alphabetical order)

• Andrea Araldo (former principal suspect)
• Raffaele Chiocchetti (former developer)
• Emilio Leonardi (well-informed outsider)
• Dario Rossi (occasional debugger)
• Giuseppe Rossini (former lead developer)
• Michele Tortelli (∞ power user)

If you wish to contact us, please use the mailing list ccnsim@listes.telecom-paristech.fr, and refrain from sending me emails in unicast, as my email loss probability is non-zero and my reply delay is in any case heavy-tailed.

## Overview

ccnSim is a scalable chunk-level simulator of Information and Content Centric Networks (ICN/CCN) that we make available as open-source software to promote cross-comparison in the scientific community. ccnSim is written in C++ under the Omnet++ framework, and features two simulation engines.

• A classic event-driven engine (available in all versions) that allows you to assess CCN performance in scenarios spanning large orders of magnitude for CCN content stores (up to 10^6 chunks) and Internet catalog sizes (up to 10^8 files) on off-the-shelf hardware (i.e., a PC with a fair amount of RAM). If you use ccnSim up to v0.3, we ask you to please acknowledge our work by citing [ICC-13] (thanks!)
• ModelGraft, a new hybrid modeling/simulation engine (available starting from v0.4) that allows for unprecedented scalability: with respect to the (highly optimized) execution times of event-driven simulation in v0.3, the new technique allows simulating much larger networks, catalogs, and content stores on an exiguous amount of RAM and with an over-100x reduction in simulation duration. If you use ccnSim v0.4 or above, we ask you to please acknowledge our work by citing [COMNET-17a] (thanks!)

You can check how fast the new version of ccnSim runs when equipped with the new ModelGraft engine vs. the classic event-driven engine in this YouTube video (which we demonstrated at [ICN-16]).

• We've just released the latest stable version of ccnSim-v0.4 as a docker image hosted on DockerHub!
• This spares you the hassle of compiling and setting up the environment and lets you quickly launch your first ccnSim-0.4 simulation, so rush to https://hub.docker.com/r/nonsns/ccnsim-0.4/ !
• Note: The container does not support the graphical interface. But, trust us, you do not need it anyway
• We've just committed the latest stable version of ccnSim-v0.4 on GitHub!!
• This is exactly the same ccnSim-0.4 version that you can download below, just hosted on GitHub
• The latest stable version of ccnSim is ccnSim-v0.4 !
• This new release does not change the simulation API, but introduces significant breakthroughs [ITC28b][COMNET-17a][CCN-TR16] in the core simulation environment that allow for very significant memory reduction and CPU speedup.
• ccnSim-v0.4 is still able to run classic Event Driven simulations (as in v0.3, but with a significant reduction of the memory footprint due to the use of rejection-inversion sampling)
• ccnSim-v0.4 additionally allows you to run a novel Monte Carlo simulation engine (new from v0.4, with significant reductions of both memory footprint and execution time)
• The Event Driven and Monte Carlo engines can be seamlessly used, so v0.3 manual is still valid
• While we update the 0.4 alpha, see the FAQs below for solutions to the compatibility issues you may encounter with the new version 5.0 of omnet++
• As for the former versions of ccnSim:
• Really, there is no reason not to use v0.4, which is not only more complete, but also simpler, more modular, and faster!
• Download of v0.3 is still allowed; it features the Event Driven simulation engine only. Notice that v0.4 improves the Event Driven engine as well, so you should consider moving to it (which, unless you have modified the code on your own, should be painless, as the two are fully interoperable)
• As of September 2015, I have disabled download of older versions (e.g., v0.1 was used in [CAMAD-12], [NOMEN-12], [ICN-12] and v0.2 in [COMCOM-13], [ICN-13], [ICC-13])
• We extensively benchmarked ccnSim's performance and refactored its code; you can find an account of the scalability properties of v0.3 in [ICC-13]
• Example scenarios to reproduce some of our latest publications [ICN-14a] and [ICN-14b] are also available
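The memory reduction mentioned above for the v0.4 event-driven engine hinges on drawing Zipf-popularity ranks on the fly instead of materializing a cumulative table over the whole catalog (up to 10^8 entries). As an illustration of the underlying technique, here is a minimal sketch of rejection-inversion sampling for a Zipf catalog (after Hörmann and Derflinger); the names and structure are ours, not ccnSim's actual code:

```python
import math
import random

def make_zipf_sampler(n, alpha, seed=None):
    """O(1)-memory sampler of ranks {1..n} with P(k) proportional to k^(-alpha),
    via rejection-inversion: no n-entry CDF table is ever built, unlike plain
    inversion sampling. Requires alpha > 0, alpha != 1 (alpha == 1 needs the
    logarithmic forms of the integrals below)."""
    rng = random.Random(seed)

    def h(x):                    # unnormalized popularity x^(-alpha)
        return math.pow(x, -alpha)

    def h_integral(x):           # antiderivative of h
        return math.pow(x, 1.0 - alpha) / (1.0 - alpha)

    def h_integral_inv(y):       # inverse of h_integral
        return math.pow(y * (1.0 - alpha), 1.0 / (1.0 - alpha))

    hi_lo = h_integral(1.5) - 1.0      # h(1) == 1
    hi_hi = h_integral(n + 0.5)

    def sample():
        while True:
            # invert a uniform draw over the area under the envelope...
            u = hi_hi + rng.random() * (hi_lo - hi_hi)
            x = h_integral_inv(u)
            k = min(max(int(x + 0.5), 1), n)
            # ...and accept k only if u falls under the histogram bar at k
            if u >= h_integral(k + 0.5) - h(k):
                return k

    return sample
```

A sampler over a 10^6-object catalog then costs a handful of floats rather than a megaword table: `sample = make_zipf_sampler(10**6, 1.2)` yields ranks with rank 1 the most frequent, as a Zipf popularity profile requires.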
| Version | Source | Download count | Manual | Scenarios |
| --- | --- | --- | --- | --- |
| 0.4 (05/2017) | ccnSim-0.4.tgz | 111 | v0.4 manual | Additional scenarios and scripts: ModelGraft sensitivity |
| 0.4alpha2 (02/2016) | ccnSim-0.4alpha2.tgz | 366 | TBD, please have a look at [CCN-TR16] and the v0.3 manual in the meanwhile | Scenarios [CCN-TR16] included in the alpha |
| 0.4alpha (12/2015) | ccnsim-0.4alpha.tgz | 229 | TBD, please have a look at [CCN-TR16] and the v0.3 manual in the meanwhile | Scenarios [CCN-TR16] included in the alpha |
| 0.3 (10/2014) | ccnsim-0.3.tgz | 1070 | PDF | Cost-Aware [ICN-14a], NRR [ICN-14b] |
| 0.2 (09/2013) | ccnsim-0.2.tgz | 741 | PDF | ccnsim-inrr-scripts.tgz (548 downloads) |
| 0.1 (03/2012) | ccnsim-0.1.zip | 1331 | PDF | - |

## FAQ

• I have troubles installing ccnSim with omnet++ 5.1 (and above)
Unfortunately, this is due to changes in opp_makemake (from the omnet++ changelog: "Support for deep includes (automatically adding each subfolder to the include path) has been dropped, due to being error-prone and having limited usefulness. In projects that used this feature, #include directives need to be updated to include the directory as well."). Fixing this issue is more involved than just specifying the folders, since omnet++ 5.1 introduces other modifications as well, among which changes to the send() and arrived() methods used to generate and process messages. Given that, aside from these non-backward-compatible changes, none is relevant for ccnSim, we recommend using omnet++ 5.0.
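If you nevertheless want to attempt the migration, the include-directive part is mechanical and can be scripted: index the headers under the source root, then prefix each bare `#include "x.h"` with its directory, reproducing what the dropped deep-includes feature used to resolve automatically. A hypothetical helper (the layout, file names, and function names below are our assumptions for illustration, not part of ccnSim):

```python
import os
import re

def build_header_index(src_root):
    """Map each bare header name to its path relative to src_root,
    i.e., what omnet++ <= 5.0 'deep includes' resolved for free."""
    index = {}
    for dirpath, _dirs, files in os.walk(src_root):
        for name in files:
            if name.endswith(".h"):
                rel = os.path.relpath(os.path.join(dirpath, name), src_root)
                index[name] = rel.replace(os.sep, "/")
    return index

def rewrite_includes(source_text, index):
    """Prefix bare #include "x.h" directives with their directory.
    Directives that already carry a '/' and system <...> includes
    are left untouched by the regex."""
    def repl(match):
        name = match.group(1)
        return '#include "%s"' % index.get(name, name)
    return re.sub(r'#include\s+"([^"/]+)"', repl, source_text)
```

You would run `rewrite_includes` over every .cc/.h file under the tree and write the result back. Note this addresses only the include paths: the send()/arrived() changes mentioned above remain, which is why sticking to omnet++ 5.0 is the simpler path.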
• I have troubles installing ccnSim with omnet++ 5.0