
What's new in the current release


Version 6.4.0 - February 28, 2017 - [download]

This release is accompanied by a new version (V1.1) of the NCL User Guide.


New functions

  • Array creation, manipulation, and query functions

  • Bootstrap functions - Application of the bootstrap method.

    • bootstrap_correl - Bootstrap estimates of sample cross correlations (ie, Pearson's correlation coefficient) between two variables.

    • bootstrap_diff - Bootstrap mean differences from two samples.

    • bootstrap_estimate - Extract the user-specified element from the bootstrapped values.

    • bootstrap_regcoef - Bootstrap estimates of the linear regression coefficient.

    • bootstrap_stat - Bootstrap estimates of basic statistics derived from a variable.

  • CESM

    • albedo_ccm - Calculates the albedo given a pair of model radiation variables.

    • time_reassign - Reassign (replace) a CF-conforming "time" variable by calculating the mid-time values using the "bounds" attribute.

    • time_reassign_cv2var - Reassign (replace) a CF-conforming "time" coordinate variable associated with a variable by calculating the mid-time values using the "bounds" attribute.

  • Crop & Evapotranspiration - A suite of functions for computing crop water requirements based upon FAO Irrigation and Drainage Paper 56 (FAO 56), including Penman-Monteith estimates of reference evapotranspiration.

  • Date, String

    • cd_inv_string - Converts string time values to numeric values, using the given format string.

    • cla_sq - Create a string that uses single quotes (') to enclose command line assignment statements (CLAs) for later use by NCL's system procedure.

  • Extreme Value Statistics - A small suite of functions focused on extreme value distributions.

    • extval_frechet - Calculates the probability (PDF) and cumulative (CDF) distribution functions of the Frechet Type II distribution given the shape, scale and location parameters.

    • extval_gev - Calculates the probability (PDF) and cumulative (CDF) distribution functions of the Generalized Extreme Value (GEV) distribution given the shape, scale and location parameters.

    • extval_gumbel - Calculates the probability (PDF) and cumulative (CDF) distribution functions of the Gumbel (Type I) distribution function given the scale and location parameters.

    • extval_mlegam - Estimates the shape, scale, location, and other parameters for the Gamma distribution using Maximum-Likelihood Estimation.

    • extval_mlegev - Estimates the shape, scale and location parameters for the Generalized Extreme-Value (GEV) distribution using Maximum-Likelihood Estimation.

    • extval_pareto - Calculates the probability (PDF) and cumulative (CDF) distribution functions of the Pareto distributions (Generalized, Type I, Type II) given the shape, scale and location parameters.

    • extval_recurrence_table - Calculates the recurrence interval (return period), cumulative and exceedance probabilities based upon a time series.

    • extval_return_period - Calculates the period of an event (eg, flood, heat wave, drought) occurring, given an average event recurrence interval and a specified probability level.

    • extval_return_prob - Calculates the probability of an event (eg, flood, heat wave, drought) given an average event interval and a specified exceedance period.

    • extval_weibull - Calculates the probability (PDF) and cumulative (CDF) distribution functions of the Weibull Type III distribution given the shape, scale and location parameters.

  • General Applied Math

    • calculate_daily_values - Calculate daily values [avg, sum, min, max] from high frequency temporal values.

    • calculate_segment_values - Calculate segment (eg, pentad [5-day], weekly [7-day]) values from high frequency temporal values.

    • ceemdan - Complete ensemble empirical mode decomposition with adaptive noise.

    • cohsq_c2p - Given coherence-squared and the effective degrees-of-freedom, calculate the associated probability.

    • cohsq_p2c - Calculate the value(s) of coherence-squared required for a specified significance level and effective degrees-of-freedom.

    • demod_cmplx - Perform a complex demodulation on one or more time series.

    • eemd - Perform ensemble empirical mode decomposition.

    • eofunc_n / eofunc_ts_n / eofunc_n_Wrap / eofunc_ts_n_Wrap / eof2data_n

      These are identical to eofunc / eofunc_ts / eofunc_Wrap / eofunc_ts_Wrap / eof2data, except you no longer need to reorder the input array so that 'time' is the rightmost dimension. (A short sketch appears at the end of this list of new functions.)

    • get_d2r - Return a constant that converts degrees to radians.

    • get_pi - Return pi as a type float or double.

    • get_r2d - Return a constant that converts radians to degrees.

  • Heat-Stress functions

    • fire_index_haines - Calculates the Haines fire index for a sounding.

    • heat_apptemp - Compute apparent temperature.

    • heat_discoi - Compute a simplified human discomfort index.

    • heat_discoi_stull - Compute the human discomfort index due to excessive heat and humidity using the Stull wet bulb temperature (wetbulb_stull).

    • heat_esidx_moran - Compute an environmental stress index (ESI) which is an alternative to the wet bulb globe temperature (WBGT).

    • heat_humidex - Compute the 'feels-like' temperature for humans.

    • heat_index_nws - Computes the 'heat index' as calculated by the National Weather Service.

    • heat_swamp_cooleff - Compute the swamp cooler temperatures at 65% and 80% efficiency.

    • heat_thic_thip - Compute the thermal humidity comfort index (thic) and the thermal humidity physiology index (thip).

    • heat_wbgt_inout - Compute the composite Wet-Bulb Globe Temperature (WBGT) index with options for indoor or outdoor formulations.

    • heat_wbgt_simplified - Simplified WBGT index.

    • wetbulb_stull - Calculate wet bulb temperature at standard sea level pressure (1013.25 hPa) using the method of R. Stull.

  • Legends

    • simple_legend - Creates a legend based on user supplied resources. simple_legend gives the user complete control over the design and placement of the legend, and can be seen as an easier way to create legends in NCL. See examples leg_16.ncl and leg_17.ncl.

  • Meteorology

    • brunt_vaisala_atm - Compute the Brunt-Vaisala frequency, which is a measure of buoyancy in a continuously stratified atmosphere.

    • coriolis_param - Compute Coriolis parameter.

    • eady_growth_rate - Compute the Eady maximum baroclinic growth rate.

    • epflux - Compute quasi-geostrophic Eliassen-Palm fluxes at isobaric levels.

    • grad_latlon_cfd - Compute the meridional and zonal gradients of a variable on a global or limited area rectilinear grid.

    • latent_heat_water - Estimate latent heat flux for water: evaporation (condensation), melting (freezing) or sublimation (deposition).

    • pot_temp_equiv - Compute equivalent potential temperature.

    • pres_hybrid_jra55 - Calculates the "full" hybrid levels for the 60-level Japanese ReAnalysis.

    • relhum_ice / relhum_water - Calculates relative humidity with respect to ice/water, given temperature, mixing ratio, and pressure.

    • rigrad_bruntv_atm - Compute the atmospheric gradient Richardson number and, optionally, the Brunt-Vaisala frequency, buoyancy, and shear.

    • satvpr_water_bolton - Estimate the saturation vapor pressure over water using Bolton's equation 10.

    • satvpr_water_stipanuk - Estimate the saturation vapor pressure over water using the Stipanuk approximation.

    • wetbulb - Compute wetbulb temperature.

    • wgt_vertical_n - Calculate a weighted vertical average and/or sum (integral).

    • wind_speed - Calculate wind speed from zonal and meridional wind components and return associated meta data.

    • wind_stats - Given a sequence of wind speeds and directions, compute assorted wind-related statistics including the standard deviation of the wind direction.
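
As referenced in the eofunc_n / eofunc_ts_n item above, here is a minimal sketch of the reordering-free EOF calls. The exact signature is not spelled out in these notes; the sketch assumes the extra final argument is the index of the 'time' dimension (consistent with the "_n" naming convention used elsewhere in NCL), and the dummy data is for illustration only. Consult the function documentation before relying on it.

  x   = random_normal(0., 1., (/120, 64, 128/))   ; dummy time x lat x lon data
  x!0 = "time"
  x!1 = "lat"
  x!2 = "lon"

  neof   = 3
  optEOF = True
  eof    = eofunc_n(x, neof, optEOF, 0)     ; 0 = index of the 'time' dimension (assumed)
  eof_ts = eofunc_ts_n(x, eof, False, 0)
  printVarSummary(eof)                      ; expect neof x lat x lon
  printVarSummary(eof_ts)                   ; expect neof x time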


New features

Block style comments

You can now do "block style" comments in NCL, using /; and ;/ to start and end a block:

/;
    This is inside an NCL block style comment,
    available in NCL V6.4.0.
;/

  print("A demonstration of block comments")

/;
  print("This line should not be printed")
  print("...nor this one")
  print("...or this one")
 ;/

NCL User Guide examples and data files included

All 100+ NCL User Guide (NUG) tutorial scripts and many of the required data files are included with NCL V6.4.0.

Look in the directories:

$NCARG_ROOT/lib/ncarg/nclex/nug
$NCARG_ROOT/lib/ncarg/data/nug

or use the "ng4ex" command to generate the full list or run any one of them individually:

  ng4ex -nug -list
  ng4ex NUG_curvilinear_basic

A complete set of scripts and data files can also be found online.

Labelbars with triangle ends

You can now generate labelbars with triangle ends instead of rectangle ends. See the lbBoxEndCapStyle resource mentioned in the new resources section, or examples lb_16.ncl, lb_17.ncl, gpm_1.ncl, and corel_3.ncl on the labelbar examples page.
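
Here is a minimal sketch of requesting triangle ends on a labelbar. It assumes "TriangleBothEnds" is one of the accepted values of lbBoxEndCapStyle (the resource documentation lists the full set), and uses random data purely for illustration:

  wks  = gsn_open_wks("png", "lb_triangle_sketch")   ; hypothetical output name
  data = random_normal(0., 1., (/32, 64/))           ; dummy 2D field
  res                  = True
  res@cnFillOn         = True
  res@lbBoxEndCapStyle = "TriangleBothEnds"          ; assumed value name; see the resource docs
  plot = gsn_csm_contour(wks, data, res)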

Preloaded NCL scripts

Several NCL scripts are now preloaded, and you no longer need to explicitly load them at the top of your own scripts. Here's a full list of the preloaded scripts:

    "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_code.ncl"
    "$NCARG_ROOT/lib/ncarg/nclscripts/csm/gsn_csm.ncl"
    "$NCARG_ROOT/lib/ncarg/nclscripts/csm/contributed.ncl"
    "$NCARG_ROOT/lib/ncarg/nclscripts/csm/shea_util.ncl"
    "$NCARG_ROOT/lib/ncarg/nclscripts/csm/bootstrap.ncl"
    "$NCARG_ROOT/lib/ncarg/nclscripts/csm/extval.ncl"
    "$NCARG_ROOT/lib/ncarg/nclscripts/wrf/WRFUserARW.ncl"

It's okay to still load these scripts even though it's no longer required.

Loading these scripts takes a few tenths of a second, which in most cases you will not notice. In situations where you need to run many very short scripts, however, the resulting overhead may noticeably slow your workflow. For this case, a new command line option, '-s', allows you to start NCL without preloading any scripts.
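
For example, to skip the preloading entirely (a sketch of the command line; '-s' is the only new piece here):

  ncl -s myscript.ncl

A script run this way would presumably need to explicitly load any of the scripts listed above that it uses, as in older versions of NCL.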


Backwards-incompatible changes

The "MediumRes" map database will now be the default in some cases

This involves a change to the default value of the mpDataBaseVersion resource.

This resource has a new default value of "Dynamic". This means that the "LowRes" database (the default in older versions of NCL) will be used, unless either of the boundary-set resources mpFillBoundarySets or mpOutlineBoundarySets are set to anything other than "Geophysical", in which case the "MediumRes" database will be used.

This behavior represents a balance between performance and accuracy for common use cases. For global-scale maps depicting just the continents and oceans, the LowRes database is more than sufficient, whereas the more detailed MediumRes database would incur additional compute overhead. However, the political boundaries of the MediumRes database are more up-to-date, and are thus favored if any political boundaries are to be drawn.

Click on the images below for a comparison with the previous version of NCL, and note the differences in the rightmost panel. The left panel of both versions shows the default "LowRes" database. The right panel of NCL V6.3.0 still shows the "LowRes" database, while the right panel of NCL V6.4.0 automatically defaults to "MediumRes".
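
If you prefer not to rely on the new "Dynamic" default, you can pin the database explicitly. A minimal sketch, with the resource names taken from the description above and the outline setting chosen purely for illustration:

  wks = gsn_open_wks("png", "map_database_sketch")
  res                       = True
  res@mpOutlineBoundarySets = "National"   ; would normally trigger "MediumRes" under the new default
  res@mpDataBaseVersion     = "LowRes"     ; explicitly request the old default database
  map = gsn_csm_map(wks, res)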

Labelbar colors will now reflect opacity used in plot

The labelbar associated with filled contour and color vector plots will now reflect any opacity applied.

If this behavior is not desired, then set the new resource lbOverrideFillOpacity to True.

Click on the images below for a comparison with the previous version of NCL, and note the differences in the bottom labelbar.
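
A minimal sketch of both behaviors, assuming a filled contour plot drawn with partially transparent fill via cnFillOpacityF, and random data used purely for illustration:

  wks  = gsn_open_wks("png", "labelbar_opacity_sketch")
  data = random_normal(0., 1., (/32, 64/))       ; dummy 2D field
  res                       = True
  res@cnFillOn              = True
  res@cnFillOpacityF        = 0.5                ; partially transparent fill
  res@lbOverrideFillOpacity = True               ; keep the labelbar fully opaque (pre-6.4.0 look)
  plot = gsn_csm_contour(wks, data, res)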


Updated functions

  • betainc / gammainc - These functions no longer require that the input values be float or double.

  • calculate_monthly_values - This function no longer requires that the time dimension name be 'time'. Also, (i) the algorithm no longer assumes that the values are equally spaced in time; and (ii) the user may specify that a minimum number of observations be used for deriving a statistic.

  • cd_inv_calendar / ut_inv_calendar - These functions were enhanced to allow different numeric types for the input parameters and the return array. The return array will still default to type "double" unless the special attribute "return_type" is set to "float", "long", or "int". Read the function documentation for more details.

  • conform / conform_dims - These functions have been updated to work more easily on arrays with degenerate dimensions, or arrays that are already the size you want to conform them to. This should simplify code where you previously had to check the rank of a variable before conforming it to another variable. (A short sketch appears after this list.)

  • gsn_csm_xy - This function underwent a major overhaul to allow for horizontal curves and bars to be filled when gsnXRefLine is set. This is the same behavior that already existed for gsnYRefLine. See the bar chart page for several horizontal bar examples.

  • printMinMax - The "units" attribute, if present, is now added to the printed output.

  • regCoef / regCoef_n - These functions now return the y-intercept.

  • regline_stats - Added several new attributes related to 95% confidence limits of the slope and y-intercept, the mean responses, and the prediction limits. Examples of use were added to the regress/trend page. See: regress_1a.ncl, regress_1b.ncl, regress_1c.ncl and regress_3.ncl.

  • skewt_func - Added ability to change the thickness of the wind barbs.

  • trend_manken - Added recognition of the opt argument: if opt is set to True, then the trend is not calculated.

  • wrf_user_vert_interp - Added recognition of a new opt@time attribute that allows you to specify which time index was used to extract the variables for interpolation.
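
As referenced in the conform / conform_dims item above, here is a minimal sketch of broadcasting a 1D latitude array against a (time, lat, lon) variable; all names and sizes are illustrative only:

  ntim  = 12
  nlat  = 64
  nlon  = 128
  lat   = fspan(-90., 90., nlat)
  x     = random_normal(0., 1., (/ntim, nlat, nlon/))     ; dummy data

  lat3d = conform(x, lat, 1)                               ; lat matches dimension 1 of x
  wgt3d = conform_dims((/ntim, nlat, nlon/), cos(lat*0.01745329), 1)   ; degrees to radians (or use the new get_d2r)
  printVarSummary(lat3d)
  printVarSummary(wgt3d)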


New resources


New example pages


New color table

The new default color table for matplotlib, MPL_viridis, was added, thanks to Brian Medeiros of NCAR.
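
A minimal sketch of selecting the new palette by name for a filled contour plot (cnFillPalette is the standard resource for assigning a named color table; the data here is random, for illustration only):

  wks  = gsn_open_wks("png", "viridis_sketch")
  data = random_normal(0., 1., (/32, 64/))
  res               = True
  res@cnFillOn      = True
  res@cnFillPalette = "MPL_viridis"
  plot = gsn_csm_contour(wks, data, res)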



File I/O Improvements

  • The "advanced" file system that supports groups and advanced data types is now the default for HDF5 and NetCDF4 data. The old HDF5 module that uses the standard file system is now deprecated and its use is not advised. HDFEOS5 still defaults to the standard file system (where groups names are appended to variable names to make them unique) because there are some remaining issues with the advanced version. The standard file system is also the default for NetCDF3 and NetCDF4 Classic Model files. While the setfileoption "FileStructure" option can still be used to explicitly switch between the "advanced" and "standard" file systems for certain formats (such as NetCDF3), it is now mostly unnecessary and it is anticipated that this option will soon become obsolete.

  • Support was added for CDF5, a NetCDF3 model variant developed at Argonne National Laboratory, that supports very large variables. Although CDF5 has a NetCDF3-like interface, NCL needs to use the advanced file system to access data in this format, primarily because it supports additional atomic types like unsigned and 64-bit integers.

  • The syntax for working with groups and variables within groups has been regularized as follows (a short sketch appears at the end of this section):
    • A variable in any group may be read from the top level without further qualification using the syntax:
      v = f->variable_name.
      If more than one variable with the same name exists in the file, the first one encountered will be returned. The search order is the same as the order returned by getfilevarnames.
    • A slash character ('/') is required as the first character naming a variable in a lower group:
      v = f->/group_name/variable_name
      This is needed for backwards compatibility: it disambiguates a group-qualified variable reference from a file variable being divided (numerically) by a local variable, because NCL has never required spaces separating operators from their operands. The slash that follows directly after the "->" operator indicates that any other slashes before the next space are separators between groups, or between a group and a variable. This means that when a variable is specified using the initial slash character, you do need a space before any slash that is meant as numerical division.
    • On the other hand, an initial slash is optional for referencing a group:
      g = f=>group1/group2
      works as well as
      g = f=>/group1/group2
    • If group or variable names contain characters, such as punctuation or white space, that are not allowed in NCL symbol names, the user can reference the group or variable using NCL string references, or by substituting underscore ('_') characters in place of each disallowed character. Of course, it is possible that this substitution could lead to ambiguities. In that case, as with variables specified without fully qualified paths, the first group or variable encountered will be returned.
    • NCL printing routines for the advanced file system always print the real names of variables with the actual characters used in the file.

  • Compound data can now be accessed either by individual component member or as a complete variable. Accessing by component relies on the dot ('.') character to separate the compound variable name from the member name. If using string referencing because either the variable name or the member contains disallowed characters, the complete name (everything following the "->" operator) should be quoted.

    Accessing a complete compound variable returns a variable of type list to NCL. Each individual component member becomes a list element. Outer dimensions (those applying to the compound variable as a whole) and inner dimensions (those applying to individual members) are combined to form the dimensionality of each element of the list.

  • Both ncl_convert2nc and ncl_filedump have been enhanced to accept filenames with spaces as parameters on the command line.

  • NCL can now read certain ECMWF GRIB1 files with records whose size is greater than 256^3, the largest size that can be specified in the conventional manner in a GRIB1 file. This was made possible due to some code borrowed from the wgrib tool. This code comes from ECMWF and uses an undocumented extension to allow for larger record sizes.

  • The HDF5 reference type is now supported at a basic level. HDF5 references are pointers to other variables in the same file. If an HDF5 reference is assigned to an NCL variable it is converted to the NCL string type. The string value will be the full path of the variable that is referenced.

  • Support for multiple unlimited dimensions in NetCDF4 and HDF5 has been improved.

  • HDF5 dimensions that are unnamed by HDF5 have a more robust naming scheme.

  • The NCEP GRIB1 and GRIB2 tables have been updated fairly recently.

  • NCL can now usually figure out the format of files that have no extension. The only remaining confusion is distinguishing between NetCDF4 files and HDF5 files that are not NetCDF4.

  • To enhance performance, the caching of unlimited variable values was eliminated. Testing has shown that the benefit varies with the file format and the size of the unlimited dimension, but the change never caused a slowdown.
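
A short sketch of the regularized group and variable syntax described above. The file, group, and variable names are hypothetical:

  f = addfile("example_with_groups.nc", "r")     ; hypothetical NetCDF4/HDF5 file containing groups

  ; read a variable from the top level without qualification; if the name occurs
  ; in more than one group, the first match (in getfilevarnames order) is returned
  v1 = f->temperature

  ; fully qualified reference: note the leading '/' directly after "->"
  v2 = f->/group1/group2/temperature

  ; reference a group; here the leading '/' is optional
  g  = f=>/group1/group2
  v3 = g->temperature

  ; a space is required before a '/' that is meant as numerical division
  half = f->/group1/temperature /2.0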

File I/O bugs fixed

  • Fixed a major bug in the GRIB1 code that resulted in some elements of the non-horizontal dimensions of a variable being presented in the wrong location in NCL's representation. This occurred in certain cases where one or more elements at the beginning of the affected dimension were missing. The code that sorts the records into their proper position was completely rewritten to fix this issue. An intended future benefit of this rewrite is to make it easier to add new dimensions as needed.

  • Fixed a bug where you couldn't read a NetCDF-4 variable that contains a '.' character; for example, "conc.instant_Cs137".

  • Fixed a bug where HDF5 strings in nested groups were not accessible.

  • Fixed a serious bug where opening an HDF5 file in "w" or "c" mode resulted in the original file being deleted. Now opening an HDF5 file works as expected in NCL.

  • Fixed a bug in which opening some HDF5 files with "new" file features (like groups, compound data, enum, etc) would cause a segmentation fault.

    A work-around for V6.3.0 is to call the following before you open the file:

      setfileoption("h5", "FileStructure", "Advanced") 
    

    or to use the "-f" option when running ncl:

      ncl -f myscript.ncl
    

  • Fixed a bug where deleting a group would delete parts of the file variable structure.

  • Fixed a bug that led to a segmentation fault when trying to read a variable from a NetCDF4 file opened in "w" mode.

  • Fixed a bug that resulted in a segmentation fault when re-writing string attributes to a NetCDF4 file, even if the attribute values were unmodified from their previous values in the file.

  • Fixed a segmentation fault that occurred when using fileattdef to create global attributes in a NetCDF4 file.

  • Fixed a problem that resulted in NCL reading a variable from the wrong group. Both variables had the same name, but the group traversing code was incorrect.

  • Fixed a problem with printing an HDF5 variable-length variable that contained reference type variables.

  • Fixed a problem where scalar variables were given an improper dimension size and the dimension improperly labeled "unlimited" when first created in a NetCDF4 file.

  • Eliminated unnecessary warnings for the GRIB "curvilinear orthogonal grid".

  • Fixed a problem with JRA GRIB1 files where the names of certain variables were no longer correct because access to some local-to-JRA information was lost due to a faulty piece of code.

  • Fixed the HDF5 code to allow access to all the possible string types including variable length strings, fixed-size NULL byte filled (C-style), and fixed-size space filled (Fortran-style). This was implemented both for attributes and regular variables.

  • Fixed an issue where using the NCL dimension name operator (!) was returning the wrong dimension name for NetCDF4 files.

  • Fixed a seg fault with GRIB files that contain a negative time period. The time period is definitely negative; whether or not it is bogus has not yet been determined.

  • Fixed a seg fault that occurred when trying to access a non-existent variable attribute in a NetCDF4 file.

  • Fixed a problem with incorrect forecast times for certain ARPEGE GRIB2 files.

  • Fixed a problem using the list style delete syntax for deleting file variables and coordinate variables.

  • Added some GRIB2 tables to allow decoding of non-standard MRMS data from NCEP.

  • addfile - Fixed a bug that limited file names to less than 256 characters.

  • addfiles - Fixed a bug with arbitrary subscripting of an aggregated dimension when the time units vary.

    Note that the time coordinate units attached to the first file in the file list are always used as the common units. If other files in the list have times earlier than the common base time, their values will be converted to negative values. They will still be correct (the correct date string can be obtained using the usual routines), but if the files in the list are not in chronological order, the following warning will be emitted:

    warning:Aggregated dimension coordinate values are non-monotonic; check aggregated file ordering
    

    Also fixed a subsequent issue with the coordinate values being calculated incorrectly and an issue where the aggregated dimension has no associated coordinate array.

  • fbinread / fbinwrite / fbindirread / fbindirwrite - Fixed a bug where these functions would fail if you tried to write more than 2 GB to a file.

  • write_table - Fixed a bug that didn't allow you to write logicals to a file.

Functions and procedures bugs fixed

  • advect_var - This function was incorrectly named in NCL V6.3.0. The correct name is advect_variable.

  • Fixed an issue with math functions like stddev in which the _FillValue was ignored if the input was of type "ubyte".

  • area_conserve_remap - Fixed a bug where if missing values were input for a 2D grid, then the output grid was not returning all missing values.

  • decimalPlaces - Fixed a bug to make this function round correctly.

  • dim_acumrun_n - Fixed a bug where this function was returning float output, even if integers were input. This function now returns the type that matches the type of the input array, except in the case of bytes or shorts, in which case integers are returned.

  • dpres_plevel - Fixed a bug for a pathological case where if psfc and ptop are between levels, the function would fail.

  • ESMF_regrid - Fixed a bug where this function would sometimes produce an incorrect error about not having the correct number of triangles for the internal triangular mesh that it generates.

  • f2foshv_Wrap / fo2fshv_Wrap - Removed incorrect syntax that subtracted or added one to the size of the latitude arrays of uNew and vNew and prevented the functions from working at all.

  • gc_latlon - This function now uses a more general algorithm to hopefully prevent NaNs from being returned when input points are very close to each other.

  • generate_sample_indices - Replaced the method used with a similar but more robust method.

  • int2p, int2p_n - Fixed a bug which occurred when the input and output pressure levels were not monotonically decreasing. The functions now work regardless of the input or output pressure level ordering.

  • kolsm2_n - Fixed a bug in which operating across a dimension of a multi-dimensional array (for example, the time dimension) would cause the results to be wrong.

  • pot_vort_isobaric - Andy Show (12/15) reported that the term [ R/(s*p) ] should be [ 1/s ].

  • rm_single_dims - Fixed a bug where dimension names of non-coordinate variables would get removed from the output array.

  • stat_dispersion - Fixed a bug where setting opt=True without attaching any attributes resulted in a fatal error.

  • table_attach_rows - Removed debugging print statements.

  • thornthwaite - This function will now exit more gracefully if time is not a multiple of 12. It will also allow a scalar time value.

  • trend_manken - Fixed a bug in which passing a very large array (e.g., 400 MB) would cause this routine to hang and/or produce a segmentation fault. The issue was with the internal sorting algorithm, which is required to calculate the trend. A new sorting algorithm was implemented, and an option to skip the trend calculation was added.

  • triple2grid - Fixed a bug where the output results were sometimes shifted by one pixel in both directions. The value for "domain" now defaults to 1 instead of 0.

  • wrf_map_resources - Fixed a bug in which rotated projection parameters were not always set correctly.

  • wrf_user_getvar - Fixed the following bugs:

    • "cape_2d" and "cape_3d" - Fixed a bug in which the PSFC variable was not being converted to hPa as required by these routines.

    • "ctt" (cloud top temperature)

      • This function now returns its values in degrees Celsius (it was previously returning them in Kelvin). This is to stay consistent with other diagnostics that are returned in Celsius.
      • Fixed a bug where the pressure values were not being converted properly.
      • Fixed the interface to allow "ctt" to be calculated for multiple timesteps when used with addfiles.

    • "helicity" - Fixed the interface to allow helicity to be calculated for multiple timesteps.

    • "omg" (omega) - Fixed a bug in which the calculation was incorrectly using the temperature values for the water vapor mixing ratio, making the output results nonsensical.

Date conversion routines bugs fixed

  • calendar_decode2 - This function has been updated to call cd_calendar instead of ut_calendar. This helps address some issues found with the Udunits code, like the infamous 60 second bug mentioned in this document. There are still occasions where the 60 second bug may occur. If it does, try the temporary calendar_decode2_fix function.

  • cd_convert - This function was fixed to recognize the "calendar" attribute.

  • day_of_year / days_in_month / day_of_week / isleapyear - These functions now correctly set the _FillValue attribute if there are potentially missing values returned.

  • time_axis_labels - Fixed a bug where this function wasn't recognizing the "calendar" attribute.

  • New temporary date conversion routines introduced to address "60 second" bug

    Many users reported a "60 second" bug in several of NCL's date conversion routines, in which you get a value of "n minutes, 60 seconds" instead of "n+1 minutes, 0 seconds". See details in the example scripts below.

    In order to address this problem, several new date conversion routines have been temporarily introduced in NCL V6.4.0 to see if they are acceptable replacements for their original counterparts:

    These functions are all based on Calcalcs, a software package created by David W. Pierce of UCSD, who is also the author of Ncview. An important note is that this software treats year 0 and year 1 separately.

    Here's a script that illustrates the "60 second" bug:

      load "$NCARG_ROOT/lib/ncarg/nclscripts/contrib/calendar_decode2.ncl"
    
      time = 300
      time@units = "seconds since 2013-11-01 00:00:0.0"
      utx = ut_calendar(time,-5)
      cdx = cd_calendar(time,-5)
      cd2x = calendar_decode2(time,-5)
      print("time:             " + time)
      print("units:            " + time@units)
      print("ut_calendar:      " + str_join(""+utx,",")) 
      print("cd_calendar:      " + str_join(""+cdx,","))              
      print("calendar_decode2: "  + str_join(""+cd2x,","))              
    
      time = 876595
      time@units = "hours since 1901-01-01 00:00:00"
      time@calendar = "gregorian"
      utx := ut_calendar(time,-5)
      cdx := cd_calendar(time,-5)
      cd2x := calendar_decode2(time,-5)
      print("time:             " + time)
      print("units:            " + time@units)
      print("ut_calendar:      " + str_join(""+utx,",")) 
      print("cd_calendar:      " + str_join(""+cdx,","))              
      print("calendar_decode2: "  + str_join(""+cd2x,","))              
    
    The minutes/seconds (and sometimes hour) values are inconsistent across the three similar routines; the buggy entries are the ones that report 60 in the seconds field instead of rolling over to the next minute:

        time:             300
        units:            seconds since 2013-11-01 00:00:0.0
        ut_calendar:      2013,11,1,0,5,0
        cd_calendar:      2013,11,1,0,4,60
        calendar_decode2: 2013,11,1,0,4,60
    
        time:             876595
        units:            hours since 1901-01-01 00:00:00
        ut_calendar:      2000,12,31,18,59,60
        cd_calendar:      2000,12,31,19,0,0
        calendar_decode2: 2000,12,31,19,59,60
    
    The results aren't necessarily wrong, but they are not consistent. In general, the preference is to never get the value 60 for seconds.

    Here's a short script contributed by Jared Lee, that further illustrates the inconsistency using the cd_string and ut_string functions:

      load "$NCARG_ROOT/lib/ncarg/nclscripts/contrib/cd_string.ncl"
      load "$NCARG_ROOT/lib/ncarg/nclscripts/contrib/ut_string.ncl"
    
      do testSec = 0, 360, 60
         testSec@units = "seconds since 1970-01-01 00:00:00"
         print("testSec (cd_string) = "+testSec+" -> "+cd_string(testSec, "%N/%D/%Y, %H:%M:%S"))
         print("testSec (ut_string) = "+testSec+" -> "+ut_string(testSec, "%N/%D/%Y, %H:%M:%S"))
      end do
    

    The inconsistent values are the cd_string results that report 60 in the seconds field:

         testSec (cd_string) =   0 -> 01/01/1970, 00:00:0
         testSec (ut_string) =   0 -> 01/01/1970, 00:00:00
         testSec (cd_string) =  60 -> 01/01/1970, 00:00:60
         testSec (ut_string) =  60 -> 01/01/1970, 00:01:00
         testSec (cd_string) = 120 -> 01/01/1970, 00:02:00
         testSec (ut_string) = 120 -> 01/01/1970, 00:02:00
         testSec (cd_string) = 180 -> 01/01/1970, 00:03:00
         testSec (ut_string) = 180 -> 01/01/1970, 00:03:00
         testSec (cd_string) = 240 -> 01/01/1970, 00:03:60
         testSec (ut_string) = 240 -> 01/01/1970, 00:04:00
         testSec (cd_string) = 300 -> 01/01/1970, 00:04:60
         testSec (ut_string) = 300 -> 01/01/1970, 00:05:00
         testSec (cd_string) = 360 -> 01/01/1970, 00:06:00
         testSec (ut_string) = 360 -> 01/01/1970, 00:06:00
    

    In an attempt to fix this problem, we searched for other calendar software packages, and found one called "Calcalcs", created by David W. Pierce of UCSD, who is also the author of Ncview. This software fixes the "60 second" bug, but it doesn't treat year 0 and year 1 as the same.

    Instead of replacing the buggy code in the cd_xxxx and ut_xxxx routines, we decided to create temporary new routines that use Calcalcs under the hood. We want users to get the chance to try them out before we make any decisions about what to do with all of the calendaring routines.

Graphics bugs fixed

  • The labelbar associated with filled contour and color vector plots will now reflect any transparency applied. See the section on Backwards-incompatible changes in 6.4.0 for more information.

  • Improved some issues with triangular meshes, which are used under the hood for contouring unstructured data. This included fixes for using triangular meshes with WRF and ICON data.

  • Fixed some issues where contour fill plots were missing small fill areas around the edges of the plot. These were most likely to occur when an axis was reversed and/or a logarithmic or irregular transformation was in effect.

  • Fixed an issue that resulted in a seg fault when using the "CurlyVector" mode for plotting vectors. This seg fault did not impact the graphics output, but it was annoying and could prevent continued work in the same script or interactive session.

  • Fixed a seg fault involving CellFill contouring using the TriangularGrid method.

  • Fixed a problem that led in some cases to the left half of a CylindricalEquidistant contour plot being left blank. The issue involved the boundary at the cyclic point of the map.

  • Fixed a bug where CellFill contouring wasn't recognizing the cnMissingValFillColor resource. This was reported in relation to plotting ICON data.

  • gsn_attach_plots - Fixed a bug where attaching plots along the X axis would misalign the plots if the Y axis labels were of different sizes.

  • gsn_create_text - Fixed a bug which didn't allow for multiple text strings to be input, as advertised.

  • gsn_csm_pres_hgt_vector - Fixed a bug where this function didn't recognize pmLabelBarWidthF or pmLabelBarHeightF settings for controlling the size of the labelbar.

  • gsn_csm_vector_scalar_map / gsn_csm_vector_scalar_map_ce / gsn_csm_vector_scalar_map_polar - Fixed a bug in which you'd get a warning:

    warning:vcRefAnnoFontHeightF is not a valid resource in 
             gsn_csm_vector_scalar_map_contour at this time
    

    and the vector reference annotation may have been larger than expected.

  • gsn_open_wks - Fixed a bug where NCL would fail if you used really long names (> 320 characters) for output graphics filenames.

  • gsn_panel - This function now allows you to set more resources to customize the panel labelbar, like lbLabelAlignment.

  • gsnYRefLine / gsnAboveYRefLineColor / gsnBelowYRefLineColor - Fixed a bug that didn't allow you to use a mix of scalar values and arrays for these resources if you had multiple curves.

  • pie_chart - Fixed a bug that caused an unnecessary fatal error:

    fatal:fspan: number of elements parameter is less-than-or-equal-to one, can't continue
    

Miscellaneous bugs fixed

  • Fixed a catastrophic failure that occurred if the "shea_util.ncl" script was loaded multiple times from the same script.

    Many users of ESMValTool reported not being able to use NCL V6.3.0 because it would seg fault. This was caused by a missing undef statement in the "shea_util.ncl" library for the "mreg_part_corr" function. This bug has been fixed and we've taken extra steps to make sure all the other loaded NCL scripts have "undef" statements before every function and procedure.

  • Fixed an issue where interactive NCL would not resume after a <ctrl>-z

    On Mac systems, if you ran ncl interactively and then typed <ctrl>-z, you couldn't resume ncl by typing "fg". This has been fixed.

  • Fixed a seg fault that occurred when you tried to use basic NCL syntax on plot variables.

    For example:

      wks = gsn_open_wks("x11","") ;; tested png and x11 both fail
      plots = new(5, graphic)
      do p=0,4
        plots(p) = gsn_csm_blank_plot(wks, res)
      end do
      plots = plots(::-1) ; seg fault would occur here
    

  • Fixed an issue with NCL failing after a very large number of variables had been created. The code for growing the relevant data structure was at fault.

  • Fixed an incorrect warning from the function isfilepresent.


Miscellaneous improvements

  • Plotting unstructured or "random" data

    When using one of the gsn_csm_xxxx_map functions to plot unstructured or "random" data represented by one-dimensional lat/lon arrays, you can now attach these arrays to your data array as special attributes called "lat1d" and "lon1d", instead of having to set the sfXArray and sfYArray resources. This is similar to how the "lat2d" and "lon2d" attributes work for curvilinear data.

    For some examples, see the Plotting data on a map examples page. A short sketch also appears at the end of this section.

  • Plotting WRF data

    If you plot WRF-ARW data using the gsn_csm_xxx scripts, you will no longer get these warning messages:

      warning:start_lat is not a valid resource in wrf_gsn_contour at this time
      warning:start_lon is not a valid resource in wrf_gsn_contour at this time
      warning:end_lat is not a valid resource in wrf_gsn_contour at this time
      warning:end_lon is not a valid resource in wrf_gsn_contour at this time
      warning:mpNestTime is not a valid resource in map at this time
    

    Note: you can safely ignore these warnings with older versions of NCL.

    You can now use cnFillPalette with wrf_contour without getting the following error:

    warning:ContourPlotSetValues: color index (255) exceeds size of palette,
            defaulting to foreground color for entry (11) 
    

  • wrf_user_getvar

    Overhauled this code to improve the handling of multiple timesteps, either in one WRF file or across multiple files.

  • Overhauled special "gsn" color resources to allow RGB/A entries

    Many of the special"gsn" Color resources now allow all types of color entries, including color index values, named colors ("red"), RGB values, and RGBA values. Use of RGBA allows you to make filled areas partially transparent.

    The updated resources include:

  • Default X11 window now larger

    When sending NCL graphics to an X11 window, the default size is now 1000 x 1000 (it was 512 x 512).

  • Colorado counties improved

    The Colorado counties in NCL's map database were updated to include Broomfield and the correct boundaries for existing counties. See example maponly_28.ncl.

  • Handling of constant and near-constant value fill updated

    Traditionally, NCL has not been able to fill areas of constant value when contouring with the AreaFill method, since it depended on the presence of at least one virtual contour line to decide that there was anything to draw. The problem included not only truly constant data, but also data where only the area mapped into the viewport was constant, or where the data was not constant but fell entirely within two adjacent specified contour levels. Prior versions of NCL included a work-around fix that often, but certainly not always, allowed these areas to be drawn with the proper color. This fix was activated by setting the resource cnConstFEnableFill to True.

    Now a new and hopefully more robust solution to this issue has been implemented. For now, cnConstFEnableFill still needs to be set True only in the case of truly constant data. The intention is that within one or two releases the default for this resource will change from False to True, so that plots of constant data are always filled.

    Note that there are a few cases where fill of constant data still does not work. These are rare but mostly encountered when using a TriangularMesh grid for contouring.

    See example coneff_17.ncl.
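
As referenced in the "Plotting unstructured or 'random' data" item above, here is a minimal sketch of the new "lat1d"/"lon1d" attributes. The station locations and values are randomly generated purely for illustration:

  ; plot 1D "station" data on a map via the lat1d/lon1d special attributes
  ; instead of setting sfXArray/sfYArray
  npts = 500
  lat  = random_uniform( -90.,  90., npts)
  lon  = random_uniform(-180., 180., npts)
  data = random_normal(0., 1., npts)

  data@lat1d = lat                 ; attach the 1D locations as special attributes
  data@lon1d = lon

  wks          = gsn_open_wks("png", "lat1d_lon1d_sketch")
  res          = True
  res@cnFillOn = True
  plot = gsn_csm_contour_map(wks, data, res)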