I was recently working with a DateTime object and wrote something like this:
DateTime dt = DateTime.Now;
dt.AddDays(1);
return dt; // still today's date! WTF?
The intellisense documentation for AddDays() says it adds a day to the date, which it doesn't: it actually returns a date with a day added, so you have to write it like this:
DateTime dt = DateTime.Now;
dt = dt.AddDays(1);
return dt; // tomorrow's date
This one has bitten me a number of times before, so I thought it would be useful to catalog the worst C# gotchas.
I'm a bit late to this party, but here are two gotchas that have both bitten me recently:
DateTime resolution
The Ticks property measures time in 10-millionths of a second (100-nanosecond blocks), but the resolution is not 100 nanoseconds, it's more like 15 ms.
This code:
long now = DateTime.Now.Ticks;
for (int i = 0; i < 10; i++)
{
    System.Threading.Thread.Sleep(1);
    Console.WriteLine(DateTime.Now.Ticks - now);
}
will give you an output of (for example):
0
0
0
0
0
0
0
156254
156254
156254
Similarly, if you look at DateTime.Now.Millisecond, you'll get values in rounded chunks of 15.625 ms: 15, 31, 46 and so on.
This particular behaviour varies from system to system, but there are other resolution-related gotchas in this date/time API.
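If you need finer-grained measurements, System.Diagnostics.Stopwatch is the usual workaround. The sketch below is my own illustration rather than part of the original answer; it checks the timer's resolution and repeats the loop above with it. The actual precision still depends on the hardware and OS.

using System;
using System.Diagnostics;

class StopwatchDemo
{
    static void Main()
    {
        // Ticks per second of the high-resolution timer; usually far finer
        // than the ~15 ms granularity of DateTime.Now.
        Console.WriteLine("IsHighResolution: " + Stopwatch.IsHighResolution);
        Console.WriteLine("Frequency: " + Stopwatch.Frequency + " ticks/s");

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < 10; i++)
        {
            System.Threading.Thread.Sleep(1);
            Console.WriteLine(sw.ElapsedTicks);
        }
    }
}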
Path.Combine
A great way to combine file paths, but it doesn't always behave the way you'd expect.
If the second parameter starts with a \ character, it won't give you a complete path:
This code:
string prefix1 = "C:\\MyFolder\\MySubFolder";
string prefix2 = "C:\\MyFolder\\MySubFolder\\";
string suffix1 = "log\\";
string suffix2 = "\\log\\";
Console.WriteLine(Path.Combine(prefix1, suffix1));
Console.WriteLine(Path.Combine(prefix1, suffix2));
Console.WriteLine(Path.Combine(prefix2, suffix1));
Console.WriteLine(Path.Combine(prefix2, suffix2));
gives you this output:
C:\MyFolder\MySubFolder\log\
\log\
C:\MyFolder\MySubFolder\log\
\log\
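One way to defend against this is to strip any leading separators from the second part before combining. A minimal sketch follows; SafeCombine is my own name, not a framework method:

using System.IO;

static class PathHelpers
{
    // Treats the second part as relative even when it starts with \ or /.
    // Note: Path.Combine also discards the first argument when the second
    // part is fully rooted (e.g. "D:\log"); this trim only covers the
    // leading-separator case shown above.
    public static string SafeCombine(string basePath, string relative)
    {
        string trimmed = relative.TrimStart(
            Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
        return Path.Combine(basePath, trimmed);
    }
}

// Usage: PathHelpers.SafeCombine(prefix1, suffix2)
// now yields C:\MyFolder\MySubFolder\log\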
LinqToSql batches slow down with the square of the batch size
Here is the question (and answer) where I explored this problem.
In a nutshell, if you try to build up too many objects in memory before calling DataContext.SubmitChanges(), you start experiencing sluggishness at a geometric rate. I have not confirmed 100% that this is the case, but it appears to me that the call to DataContext.GetChangeSet() causes the data context to perform an equivalence evaluation (.Equals()) on every single combination of 2 items in the change set, probably to make sure it's not double-inserting or causing other concurrency issues. Problem is that if you have very large batches, the number of comparisons increases proportionately with the square of n, i.e. (n^2+n)/2. 1,000 items in memory means over 500,000 comparisons... and that can take a heckuva long time.
To avoid this, you have to make sure that for any batch where you expect a large number of items, you do the whole thing inside a transaction boundary and save each item as you create it, rather than doing one big save at the end.
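A rough sketch of that pattern, assuming a LINQ to SQL context; MyDataContext, db.Items and MakeItem are hypothetical placeholders for your own context, table and item factory:

using (var scope = new System.Transactions.TransactionScope())
using (var db = new MyDataContext())
{
    for (int i = 0; i < 10000; i++)
    {
        // Hypothetical item factory; the point is that the change set
        // never holds more than one pending object at a time.
        db.Items.InsertOnSubmit(MakeItem(i));
        db.SubmitChanges();
    }
    scope.Complete(); // commit the whole batch at the end
}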
The contract for Stream.Read is something I've seen trip up a lot of people:
// Read 8 bytes and turn them into a ulong
byte[] data = new byte[8];
stream.Read(data, 0, 8); // <-- WRONG!
ulong value = BitConverter.ToUInt64(data, 0);
The reason this is wrong is that Stream.Read reads at most the specified number of bytes, but is entirely free to read just 1 byte, even if another 7 bytes are available before the end of the stream.
It doesn't help that this looks so similar to Stream.Write, which is guaranteed to have written all the bytes if it returns without an exception. It doesn't help that the above code works almost all of the time. And of course it doesn't help that there is no ready-made, convenient method for reading exactly N bytes correctly.
So, to plug this hole and raise awareness of it, here is an example of a correct way to do it:
/// <summary>
/// Attempts to fill the buffer with the specified number of bytes from the
/// stream. If there are fewer bytes left in the stream than requested then
/// all available bytes will be read into the buffer.
/// </summary>
/// <param name="stream">Stream to read from.</param>
/// <param name="buffer">Buffer to write the bytes to.</param>
/// <param name="offset">Offset at which to write the first byte read from
/// the stream.</param>
/// <param name="length">Number of bytes to read from the stream.</param>
/// <returns>Number of bytes read from the stream into buffer. This may be
/// less than requested, but only if the stream ended before the
/// required number of bytes were read.</returns>
public static int FillBuffer(this Stream stream,
    byte[] buffer, int offset, int length)
{
    int totalRead = 0;
    while (length > 0)
    {
        var read = stream.Read(buffer, offset, length);
        if (read == 0)
            return totalRead;
        offset += read;
        length -= read;
        totalRead += read;
    }
    return totalRead;
}
/// <summary>
/// Attempts to read the specified number of bytes from the stream. If
/// there are fewer bytes left before the end of the stream, a shorter
/// (possibly empty) array is returned.
/// </summary>
/// <param name="stream">Stream to read from.</param>
/// <param name="length">Number of bytes to read from the stream.</param>
public static byte[] Read(this Stream stream, int length)
{
    byte[] buf = new byte[length];
    int read = stream.FillBuffer(buf, 0, length);
    if (read < length)
        Array.Resize(ref buf, read);
    return buf;
}